Adding another way of calculating gradients in downsamplers #540
Conversation
...trainer_server/internal/trainer/remote_downsamplers/abstract_matrix_downsampling_strategy.py
...server/internal/trainer/remote_downsamplers/abstract_per_label_remote_downsample_strategy.py
Codecov Report

Attention: Patch coverage is

Additional details and impacted files:

@@            Coverage Diff             @@
##             main     #540      +/-   ##
==========================================
- Coverage   82.84%   82.83%   -0.01%
==========================================
  Files         220      220
  Lines       10202    10232      +30
==========================================
+ Hits         8452     8476      +24
- Misses       1750     1756       +6
==========================================

View full report in Codecov by Sentry.
modyn/trainer_server/internal/trainer/remote_downsamplers/remote_craig_downsampling.py
Thanks, my only concern (and I think codecov complains about this as well) is that we should at least test that the new code path does not crash. But the changes themselves look good, and I tried to validate that the old logic stays the same :)
.../tests/trainer_server/internal/trainer/remote_downsamplers/test_craig_remote_downsampling.py
...trainer_server/internal/trainer/remote_downsamplers/abstract_remote_downsampling_strategy.py
Thanks, the new tests look good. Feel free to merge!
This PR addresses #505.
It introduces a flag that switches the gradients used by the selection algorithms between (a) the last-layer gradients alone and (b) the concatenation of the last-layer gradients with the penultimate-layer gradients.
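The idea behind the flag can be sketched as follows. This is not modyn's actual implementation; it is a minimal NumPy illustration under common assumptions for this family of selection methods: for cross-entropy loss, the per-sample gradient with respect to the last layer's pre-activation outputs is `softmax(logits) - one_hot(targets)`, and the gradient with respect to the last layer's weights can be formed as the outer product of that error signal with the penultimate-layer embeddings. The function name and arguments are hypothetical.

```python
import numpy as np


def softmax(logits):
    # Numerically stable softmax along the class axis.
    z = logits - logits.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)


def per_sample_gradients(logits, targets, embeddings, use_previous_layer):
    """Per-sample gradient features for downsampling (illustrative sketch).

    logits:      (n, num_classes) model outputs before softmax
    targets:     (n,) integer class labels
    embeddings:  (n, dim) penultimate-layer activations
    use_previous_layer: the flag this PR introduces (name assumed here)
    """
    n, num_classes = logits.shape
    one_hot = np.zeros_like(logits)
    one_hot[np.arange(n), targets] = 1.0

    # Gradient of cross-entropy w.r.t. the last layer's pre-activations.
    last_layer_grad = softmax(logits) - one_hot  # shape (n, num_classes)
    if not use_previous_layer:
        return last_layer_grad

    # Gradient w.r.t. the last layer's weights: per-sample outer product
    # of the error signal with the penultimate embeddings, flattened.
    weight_grad = np.einsum("nc,nd->ncd", last_layer_grad, embeddings)
    return np.concatenate(
        [last_layer_grad, weight_grad.reshape(n, -1)], axis=1
    )  # shape (n, num_classes * (1 + dim))
```

With the flag off, each sample contributes a `num_classes`-dimensional gradient; with it on, the feature grows to `num_classes * (1 + dim)` dimensions, which gives the selection algorithm a richer (but more expensive) similarity signal.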