fix: bug in variables ordering in NormalizedReluBounding #98
Conversation
Would it make sense to add a test that would catch this bug next time?
The effect of the bug only shows up in the training metrics. I can write a test that fails if the order of variables is still wrong after initialization.
Yes, more tests would be good. I would postpone this to a different pull request, given that we will refactor the interface of this function to make it simpler over the next few days. The tests will also inevitably change with the refactor.
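For reference, a minimal sketch of the kind of initialization-order check described above (hypothetical names and a stand-in object; the real test would instantiate `NormalizedReluBounding` and inspect its attributes):

```python
from types import SimpleNamespace
import torch

def check_bounding_order(bounding, config_min_by_name, index_to_name):
    """Fail if norm_min_val is not aligned with data_index after init."""
    for idx, m in zip(bounding.data_index.tolist(),
                      bounding.norm_min_val.tolist()):
        # The minimum stored at each position must belong to the variable
        # that actually lives at that tensor index, whatever the config order.
        assert m == config_min_by_name[index_to_name[idx]]

# Stand-in object in the correctly ordered (fixed) state; the config listed
# q before tp, but tp has the smaller tensor index.
fixed = SimpleNamespace(data_index=torch.tensor([0, 3]),
                        norm_min_val=torch.tensor([0.0, 0.5]))
check_bounding_order(fixed, {"q": 0.5, "tp": 0.0}, {0: "tp", 3: "q"})
```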
Looks good.
A bug affects `anemoi.models.layers.bounding.NormalizedReluBounding` when the order of the variables in the config does not match the order of the variables in the torch array. The problem emerged only when multiple variables were bounded at the same time with this procedure. It is fixed by making sure that `self.norm_min_val` follows the same order as `self.data_index`.
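For illustration, a minimal self-contained sketch of the fix (hypothetical variable names and a simplified ReLU-style clamp, not the actual anemoi-models implementation): the per-variable minima from the config are permuted so that they follow the same tensor index order as `self.data_index`.

```python
import torch

# Hypothetical setup: tensor columns follow name_to_index, while the config
# lists the bounded variables (and their minima) in a different order.
name_to_index = {"tp": 0, "q": 3}
config_vars, config_mins = ["q", "tp"], [0.5, 0.0]
x = torch.tensor([[-0.5, 1.0, 2.0, -0.5]])  # column 0 is tp, column 3 is q

# The fix: sort the (variable, min) pairs by tensor index together, so that
# norm_min_val follows data_index. Keeping config_mins as-is while sorting
# only the indices would wrongly pair q's minimum (0.5) with tp's column.
pairs = sorted(zip(config_vars, config_mins), key=lambda p: name_to_index[p[0]])
data_index = torch.tensor([name_to_index[v] for v, _ in pairs])   # [0, 3]
norm_min_val = torch.tensor([m for _, m in pairs])                # [0.0, 0.5]

# ReLU-style lower bound applied to the bounded columns only.
x[:, data_index] = torch.maximum(x[:, data_index], norm_min_val)
print(x)  # tensor([[0.0, 1.0, 2.0, 0.5]]) -- tp floored at 0.0, q at 0.5
```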