[patch] 0.9.5 #419
Conversation
* Refactor storage test
* Remove empty line
* Rename workflow in tests: With the introduction of a pickle backend, this test wound up failing. I haven't been able to work out exactly why, but since pickle is much faster, my guess is that something in the test parallelization gets into a race condition with deleting a saved workflow from another test. Simply renaming it here solved the issue, which is an innocuous change.
* Introduce `pickle` as a backend for saving
* Fix root cause of storage conflict: The reason workflow storage tests were failing after introducing pickle was that the original version of the test had the wrong order of the try/finally scope and the for-loop scope. I got away with it earlier because of interplay between the outer-most member of the loop and the default storage backend. Actually fixing the problem is as simple as ensuring the "finally delete" clause happens after _each_ loop step. This does that and reverts the renaming of the workflow (see the sketch after this list).
* Again, correctly order try/finally and for-loops
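A minimal sketch of the scoping fix, assuming a stand-in file write in place of the real workflow save (the actual `Workflow` storage API is not shown): with the `try`/`finally` inside the loop, the cleanup runs after each backend instead of only once after the last one.

```python
import tempfile
from pathlib import Path

def storage_roundtrip_test(backends=("h5", "pickle")):
    # Illustrative stand-in for the real storage test, not pyiron_workflow code.
    tmp = Path(tempfile.mkdtemp())
    for backend in backends:
        saved = tmp / f"wf.{backend}"      # stands in for the saved workflow artifact
        try:
            saved.write_text("payload")    # stands in for wf.save(backend=backend)
            assert saved.read_text() == "payload"
        finally:
            saved.unlink(missing_ok=True)  # "finally delete" now runs after *each* backend
    # The buggy ordering wrapped the whole for-loop in a single try/finally,
    # so only the final backend's artifact ever got cleaned up.

storage_roundtrip_test()
```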
* Refactor storage test
* Remove empty line
* Rename workflow in tests: With the introduction of a pickle backend, this test wound up failing. I haven't been able to work out exactly why, but since pickle is much faster, my guess is that something in the test parallelization gets into a race condition with deleting a saved workflow from another test. Simply renaming it here solved the issue, which is an innocuous change.
* Introduce `pickle` as a backend for saving
* Fix root cause of storage conflict: The reason workflow storage tests were failing after introducing pickle was that the original version of the test had the wrong order of the try/finally scope and the for-loop scope. I got away with it earlier because of interplay between the outer-most member of the loop and the default storage backend. Actually fixing the problem is as simple as ensuring the "finally delete" clause happens after _each_ loop step. This does that and reverts the renaming of the workflow.
* Again, correctly order try/finally and for-loops
* Remove keyword argument from pure-decorator: You're only supposed to use it as a decorator to start with, so the kwarg was senseless.
* Add factory import source: This is necessary (but not sufficient) to get `as_dataclass_node`-decorated classes to pickle. The field name is a bit off compared to `Function` and `Macro`, since we now decorate a class definition instead of a function definition, but it's close.
* Bring the dataclass node in line with function and macro
* Leverage new pyiron_snippets.factory stuff to find the class
* Mangle the stored dataclass qualname so it can be found later (see the sketch after this list)
* Add tests
* Update docs examples to reflect new naming
* Update snippets dependency
* [dependabot skip] Update env file
* Use new pyiron_snippets syntax consistently
* Expose `as_dataclass_node` in the API, now that it's pickling as well as anything else
* Format black

Co-authored-by: pyiron-runner <pyiron@mpie.de>
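The pickling problem with factory-made classes is the standard one: pickle stores classes by reference (`__module__` plus `__qualname__`), so a class built dynamically inside a factory points back at the factory rather than at the user's decorated definition and cannot be found on load. The toy decorator below illustrates the general mechanism only; the real fix goes through `pyiron_snippets.factory`, and the names here are made up.

```python
import pickle

def as_importable(cls):
    # Toy decorator, not the pyiron_snippets.factory machinery: build a new
    # class dynamically, then point __module__/__qualname__ back at the
    # decorated definition so pickle's by-reference lookup can find it again.
    made = type(cls.__name__, (cls,), {"made_by_factory": True})
    made.__module__ = cls.__module__
    made.__qualname__ = cls.__qualname__
    return made

@as_importable
class Payload:
    def __init__(self, x):
        self.x = x

restored = pickle.loads(pickle.dumps(Payload(42)))
print(restored.x, restored.made_by_factory)  # 42 True
```

When the factory lives in a separate module, the dynamically built class would otherwise report that module as its home, which is why the stored qualname has to be adjusted to something importable.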
* Refactor storage test
* Remove empty line
* Rename workflow in tests: With the introduction of a pickle backend, this test wound up failing. I haven't been able to work out exactly why, but since pickle is much faster, my guess is that something in the test parallelization gets into a race condition with deleting a saved workflow from another test. Simply renaming it here solved the issue, which is an innocuous change.
* Introduce `pickle` as a backend for saving
* Fix root cause of storage conflict: The reason workflow storage tests were failing after introducing pickle was that the original version of the test had the wrong order of the try/finally scope and the for-loop scope. I got away with it earlier because of interplay between the outer-most member of the loop and the default storage backend. Actually fixing the problem is as simple as ensuring the "finally delete" clause happens after _each_ loop step. This does that and reverts the renaming of the workflow.
* Again, correctly order try/finally and for-loops
* Remove keyword argument from pure-decorator: You're only supposed to use it as a decorator to start with, so the kwarg was senseless.
* Add factory import source: This is necessary (but not sufficient) to get `as_dataclass_node`-decorated classes to pickle. The field name is a bit off compared to `Function` and `Macro`, since we now decorate a class definition instead of a function definition, but it's close.
* Bring the dataclass node in line with function and macro
* Leverage new pyiron_snippets.factory stuff to find the class
* Mangle the stored dataclass qualname so it can be found later
* Add tests
* Update docs examples to reflect new naming
* Update snippets dependency
* [dependabot skip] Update env file
* Use new pyiron_snippets syntax consistently
* Expose `as_dataclass_node` in the API, now that it's pickling as well as anything else
* [patch] Fall back on cloudpickle when the pickle backend fails (see the sketch after this list)
* Format black

Co-authored-by: pyiron-runner <pyiron@mpie.de>
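The cloudpickle fallback can be pictured as a thin wrapper along these lines. This is only a sketch of the idea under assumed names and exception types, not pyiron_workflow's actual storage code.

```python
import pickle

try:
    import cloudpickle  # optional dependency assumed for this sketch
except ImportError:
    cloudpickle = None

def dumps_with_fallback(obj) -> bytes:
    # Try the fast standard-library pickle first; fall back on cloudpickle,
    # which can serialize lambdas, locally defined classes, and similar objects.
    try:
        return pickle.dumps(obj)
    except (pickle.PicklingError, TypeError, AttributeError):
        if cloudpickle is None:
            raise
        return cloudpickle.dumps(obj)

if cloudpickle is not None:
    blob = dumps_with_fallback(lambda x: x + 1)  # plain pickle alone fails on a lambda
```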
* Refactor storage test
* Remove empty line
* Rename workflow in tests: With the introduction of a pickle backend, this test wound up failing. I haven't been able to work out exactly why, but since pickle is much faster, my guess is that something in the test parallelization gets into a race condition with deleting a saved workflow from another test. Simply renaming it here solved the issue, which is an innocuous change.
* Introduce `pickle` as a backend for saving
* Fix root cause of storage conflict: The reason workflow storage tests were failing after introducing pickle was that the original version of the test had the wrong order of the try/finally scope and the for-loop scope. I got away with it earlier because of interplay between the outer-most member of the loop and the default storage backend. Actually fixing the problem is as simple as ensuring the "finally delete" clause happens after _each_ loop step. This does that and reverts the renaming of the workflow.
* Again, correctly order try/finally and for-loops
* Remove keyword argument from pure-decorator: You're only supposed to use it as a decorator to start with, so the kwarg was senseless.
* Add factory import source: This is necessary (but not sufficient) to get `as_dataclass_node`-decorated classes to pickle. The field name is a bit off compared to `Function` and `Macro`, since we now decorate a class definition instead of a function definition, but it's close.
* Bring the dataclass node in line with function and macro
* Leverage new pyiron_snippets.factory stuff to find the class
* Mangle the stored dataclass qualname so it can be found later
* Add tests
* Update docs examples to reflect new naming
* Update snippets dependency
* [dependabot skip] Update env file
* Use new pyiron_snippets syntax consistently
* Expose `as_dataclass_node` in the API, now that it's pickling as well as anything else
* [patch] Fall back on cloudpickle when the pickle backend fails
* [minor] Make pickle the default storage backend
* Format black
* Fall back on loading any storage contents, regardless of what the specified storage backend was (see the sketch after this list)
* Format black

Co-authored-by: pyiron-runner <pyiron@mpie.de>
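The "fall back on loading any storage contents" item can be sketched as probing every known backend's saved artifact on load rather than only the configured one. The file suffixes and helper name below are illustrative assumptions, not the real storage layout.

```python
import pickle
from pathlib import Path

def load_any_contents(stem: Path):
    # Look for a saved artifact from *any* known backend, not just the backend
    # the node is currently configured with.  Suffixes here are made up.
    candidates = {
        "pickle": stem.with_suffix(".pckl"),
        "cloudpickle": stem.with_suffix(".cpckl"),  # cloudpickle output loads with plain pickle
    }
    for backend, path in candidates.items():
        if path.exists():
            return backend, pickle.loads(path.read_bytes())
    raise FileNotFoundError(f"No stored contents found for {stem}")
```

Loading this way means a workflow saved with one backend can still be recovered after the default backend changes, which matters once pickle becomes the default.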
Codacy nits are all just Codacy hating pickle.
A landing page for the stack of PRs for the next patch bump. All have tests passing on my local machine, but are waiting for a new version of `pyiron_base` which allows the most-recent version of `pyiron_snippets`. In the meantime, I want to merge down the stack for some sanity.

TODO: