fix: specify version of pytest #88

Merged
2 commits merged into exercism:main on Feb 5, 2022

Conversation

noritakaIzumi
Contributor

Pytest recently released version 7.0.0, which changed its dependencies.
So we temporarily pin the version to stay compatible with 6.

Related to: exercism/python#2892
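A pin like this is normally expressed as a bounded requirement such as `pytest>=6.2,<7.0`. Purely as an illustration of what that bound means (the helper below is hypothetical, not code from this repo; real tooling should use `packaging.version`), the semantics can be sketched as:

```python
def satisfies_pin(version: str) -> bool:
    """Illustrative check: does a pytest version string fall in >=6.2,<7.0?

    (Hypothetical helper for explanation only.)
    """
    major, minor = (int(part) for part in version.split(".")[:2])
    return (major, minor) >= (6, 2) and major < 7

# pytest 6.2.5 stays inside the pin; 7.0.0 falls outside it.
```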

@BethanyG
Member

BethanyG commented Feb 5, 2022

Oh my. It would appear that the GitBots are out to get us today. 😉 I must have done something pretty horrible, because they just do not want to cooperate. 😆 Ok. Let me see why the golden tests are failing...

@noritakaIzumi
Contributor Author

noritakaIzumi commented Feb 5, 2022

I hope things don't get any worse... 😅

Two months ago, the test CI passed with:

platform linux -- Python 3.9.9, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /usr/local/bin/python
cachedir: /tmp/python_cache_dir
rootdir: /opt/test-runner, configfile: pytest.ini
plugins: subtests-0.5.0

This time, the tests failed with:

platform linux -- Python 3.9.10, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /usr/local/bin/python
cachedir: /tmp/python_cache_dir
rootdir: /opt/test-runner, configfile: pytest.ini
plugins: subtests-0.6.0

@BethanyG
Member

BethanyG commented Feb 5, 2022

@noritakaIzumi -- this may be more than what you want to do right now. So it is fine to say no. 😄 But the reason this change is failing is due to the "golden tests" that this repo runs.

The tests under /test/<test-name> have a results.json saved. The CI takes the <example-test-name>.py and the <example-test.py> files and uses the test runner to generate a new results.json. It then compares the new file to the old file and fails if they don't match exactly.

Usually, the mismatch is due to a code change within the runner itself that changes the JSON output -- but sometimes formats will also change depending on the version of the container being run, the version of Python being used, or subtle differences between the way one platform formats a particular Unicode character.

So .. like now ... it can be a royal pain, since you didn't even touch the runner code -- only the requirements. 🙄 BUUUUT .. it does look like the damn container happily upgraded itself to Python 3.10 meanwhile, so that would probably do it. 😱

So the way we "fix" this is to manually regenerate new results.json files for each case that fails, and check those in. Then have the CI run the check again to make sure they match...but maybe we can also just make sure that the container doesn't use Python 3.10?? 😆

Yeah. The upgrade bugs are really swarming today, aren't they??

@BethanyG
Member

BethanyG commented Feb 5, 2022

Nope. I'm wrong. It has to be the subtests module. The Python version (I misread it) is at 3.9.10.

@BethanyG
Member

BethanyG left a comment
Thank you for bearing with me. 😄 🌟 And for making this change!

@BethanyG BethanyG merged commit 46ab302 into exercism:main Feb 5, 2022
@noritakaIzumi
Contributor Author

Thank you for fixing the test runner 😄👏
We'll want to raise the version in the future... 💪

@BethanyG
Member

BethanyG commented Feb 5, 2022

LOL. Yes. Upgrading is going to be a fun project. 😁 But we do want to upgrade to PyTest 7.0 (and probably subtests 0.6.0) soon.

It looks like the upgrade is going to need:

  • careful checking of requirements
  • double-checking that all versions of Python 3.6-3.9 work with it
  • testing that the tests and exercises are all ok
  • regenerating golden files where needed
  • changing test files if needed
  • changes to the data.py code in this repo (because of toml vs tomli) and in the Python repo (for the same reason)

I'll write up a few issues. LMK if you're interested in working on any of the tasks. 😄

@BethanyG BethanyG mentioned this pull request Feb 10, 2022