
tox should call pytest indirectly #8

Closed · wants to merge 1 commit


@mtelka commented Oct 27, 2022

... to get better support for tox-current-env.
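For context, the single commit presumably replaces the bare pytest invocation in the tox configuration with an explicit interpreter call, so that a project's tox.ini would end up along these lines (a sketch of the intent only, not the exact diff in this PR):

$ cat tox.ini
[testenv]
commands =
    python -m pytest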

@jaraco (Owner) commented Jun 24, 2023

I'm not familiar with tox-current-env.

I was able to replicate the presumed failure:

 backports.entry_points_selectable main @ pip-run tox tox-current-env -- -m tox --current-env
py: commands[0]> pytest
py: exit 2 (0.00 seconds) /Users/jaraco/code/jaraco/backports.entry_points_selectable> pytest
  py: FAIL code 2 (0.04=setup[0.03]+cmd[0.00] seconds)
  evaluation failed :( (0.07 seconds)

However, even with this PR, the test fails:

 backports.entry_points_selectable mtelka/main @ pip-run tox tox-current-env -- -m tox --current-env
py: commands[0]> python -m pytest
/Users/jaraco/code/jaraco/backports.entry_points_selectable/.tox/py/bin/python: No module named pytest
py: exit 1 (0.02 seconds) /Users/jaraco/code/jaraco/backports.entry_points_selectable> python -m pytest pid=59859
  py: FAIL code 1 (0.05=setup[0.02]+cmd[0.02] seconds)
  evaluation failed :( (0.08 seconds)

Most importantly, I'm not sure applying this change here is the right place. The code in question is derived from jaraco/skeleton, so if support for tox-current-env is important, we should consider applying it broadly to all packages. If it's only relevant to this project, we should be circumspect about applying it at all.

Please provide more background explaining why this change is valuable, and consider filing an issue with jaraco/skeleton explaining why it should be applied there.

@mtelka (Author) commented Jun 24, 2023

My scenario is a bit different:

$ ls -l /usr/bin/pytest{,-3*} /usr/bin/python{,3.[79]} /usr/bin/tox{,-3*}
lrwxrwxrwx 1 root root    10 Sep 15  2022 /usr/bin/pytest -> pytest-3.9
-r-xr-xr-x 1 root bin    223 Nov 12  2022 /usr/bin/pytest-3.7
-r-xr-xr-x 1 root bin    223 Nov 12  2022 /usr/bin/pytest-3.9
lrwxrwxrwx 1 root root     9 Sep 14  2022 /usr/bin/python -> python3.9
-r-xr-xr-x 2 root bin  16416 Feb 20 07:50 /usr/bin/python3.7
-r-xr-xr-x 1 root bin  16368 Feb 19 23:55 /usr/bin/python3.9
lrwxrwxrwx 1 root root     7 Oct 14  2022 /usr/bin/tox -> tox-3.9
-r-xr-xr-x 1 root bin    206 Jan 15 22:39 /usr/bin/tox-3.7
-r-xr-xr-x 1 root bin    206 Jan 15 22:39 /usr/bin/tox-3.9
$ cat tox.ini 
[testenv]
commands =
    -pytest
    -python -m pytest
$ /usr/bin/tox-3.7 --current-env --no-provision --recreate -e py37
ROOT: tox-gh-actions won't override envlist because tox is not running in GitHub Actions
py37: remove tox env folder /tmp/test/.tox/py37
py37: commands[0]> pytest
================================================================================================================ test session starts ================================================================================================================
platform sunos5 -- Python 3.9.16, pytest-7.4.0, pluggy-1.2.0
cachedir: .tox/py37/.pytest_cache
Using --randomly-seed=3895753261
benchmark: 4.0.0 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)
rootdir: /tmp/test
plugins: mypy-0.10.3, time-machine-2.9.0, env-0.8.2, flake8-1.1.1, datadir-1.4.1, mock-3.11.1, console-scripts-1.4.1, socket-0.6.0, xdist-3.2.1, perf-0.12.0, forked-1.6.0, asyncio-0.21.0, cov-4.1.0, rerunfailures-11.1.2, randomly-3.12.0, lazy-fixture-0.6.3, backports.unittest-mock-1.5, typeguard-4.0.0, freezegun-0.4.2, timeout-2.0.2, hypothesis-6.79.1, subtests-0.10.0, mypy-plugins-1.11.1, pytest_freezer-0.4.8, black-0.3.12, regressions-2.4.2, black-multipy-1.0.1, kgb-7.1.1, teamcity-messages-1.32, checkdocs-2.9.0, flaky-3.7.0, reporter-0.5.2, expect-1.1.0, benchmark-4.0.0, travis-fold-1.3.0, enabler-2.1.0, jaraco.test-5.3.0, pyfakefs-5.2.2
asyncio: mode=strict
collected 0 items                                                                                                                                                                                                                                   

=============================================================================================================== no tests ran in 0.15s ===============================================================================================================
py37: exit 5 (2.98 seconds) /tmp/test> pytest pid=4281
py37: command failed but is marked ignore outcome so handling it as success
py37: commands[1]> python -m pytest
================================================================================================================ test session starts ================================================================================================================
platform sunos5 -- Python 3.7.16, pytest-7.4.0, pluggy-1.2.0
cachedir: .tox/py37/.pytest_cache
benchmark: 4.0.0 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)
Using --randomly-seed=3050309400
rootdir: /tmp/test
plugins: xdist-3.2.1, kgb-7.1.1, subtests-0.10.0, jaraco.test-5.3.0, reporter-0.5.2, mypy-plugins-1.11.1, travis-fold-1.3.0, hypothesis-6.79.1, time-machine-2.9.0, regressions-2.4.2, benchmark-4.0.0, freezegun-0.4.2, checkdocs-2.9.0, mypy-0.10.3, datadir-1.4.1, teamcity-messages-1.32, pyfakefs-5.2.2, typeguard-4.0.0, flake8-1.1.1, black-multipy-1.0.1, flaky-3.7.0, backports.unittest-mock-1.5, randomly-3.12.0, expect-1.1.0, asyncio-0.21.0, pytest_freezer-0.4.8, rerunfailures-11.1.2, black-0.3.12, socket-0.6.0, perf-0.12.0, mock-3.11.1, forked-1.6.0, lazy-fixture-0.6.3, env-0.8.2, enabler-2.1.0, timeout-2.0.2, cov-4.1.0, console-scripts-1.4.0
asyncio: mode=strict
collected 0 items                                                                                                                                                                                                                                   

=============================================================================================================== no tests ran in 0.12s ===============================================================================================================
py37: exit 5 (2.87 seconds) /tmp/test> python -m pytest pid=4335
py37: command failed but is marked ignore outcome so handling it as success
  py37: OK (5.92=setup[0.06]+cmd[2.98,2.87] seconds)
  congratulations :) (6.51 seconds)
$

Please note the wrong Python version for the plain pytest command above (commands[0]): it reports Python 3.9.16 even though the py37 environment was requested.
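A plausible explanation (an assumption based on the directory listing above, not verified in this thread): a bare pytest command resolves through PATH to /usr/bin/pytest, a symlink to pytest-3.9 whose launcher is tied to Python 3.9, so commands[0] runs under 3.9 even in the py37 environment, while python -m pytest uses whatever interpreter the environment itself provides (here 3.7). Checking the wrapper's shebang would confirm which interpreter it is pinned to:

$ head -1 /usr/bin/pytest-3.9    # expected to show a python3.9 shebang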

@jaraco (Owner) commented Jun 25, 2023

Since this issue isn't specific to this project, I'll address the concerns in the skeleton project.
