Fix: move additional metrics from approximator to networks #500
base: dev
Conversation
Supplying the additional metrics for inference and summary networks via the approximator's compile method caused problems during deserialization (#497). This can be resolved nicely by moving the metrics directly to the networks' constructors, analogous to how Keras normally handles custom metrics in layers. As summary networks and inference networks inherit from the respective base classes, this change only requires minor adaptations.
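A minimal sketch of the resulting usage. The `metrics` constructor argument and the exact classes shown here are assumptions for illustration, not necessarily the final API:

```python
import keras
import bayesflow as bf

# Metrics are passed to the network constructors, where they are tracked
# and serialized like any other Keras layer state ...
summary_network = bf.networks.DeepSet(
    metrics=[keras.metrics.MeanSquaredError()],
)
inference_network = bf.networks.CouplingFlow(
    metrics=[keras.metrics.MeanSquaredError()],
)

# ... instead of being supplied via approximator.compile(...), which broke
# deserialization (#497).
approximator = bf.ContinuousApproximator(
    adapter=bf.Adapter(),
    summary_network=summary_network,
    inference_network=inference_network,
)
approximator.compile(optimizer="adam")
```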
This change makes the auto-config more capable for our purposes by allowing any serializable value, not only basic types. We have to check whether this brings any footguns/downsides, or whether it is fine for our setting. It also replaces Keras' serialization functions with our custom ones.
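As a hedged illustration of what "any serializable value" can look like in a layer config (a toy class, not code from this PR), using Keras 3's public serialization helpers:

```python
import keras


@keras.saving.register_keras_serializable("example")
class MetricCarryingLayer(keras.Layer):
    """Toy layer whose constructor accepts metric objects and serializes
    them explicitly, since the auto-config only covers basic Python types."""

    def __init__(self, metrics=None, **kwargs):
        super().__init__(**kwargs)
        self.custom_metrics = metrics or []

    def get_config(self):
        config = super().get_config()
        # non-basic values need an explicit serializable representation
        config["metrics"] = [
            keras.saving.serialize_keras_object(m) for m in self.custom_metrics
        ]
        return config

    @classmethod
    def from_config(cls, config):
        metrics = [
            keras.saving.deserialize_keras_object(m) for m in config.pop("metrics")
        ]
        return cls(metrics=metrics, **config)
```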
@LarsKue @stefanradev93 @han-ol I encountered the problem that to pass metrics to the networks in their constructors, we would have to specify all …
I would let @LarsKue chip in on this, as I am a bit concerned about having more and more "custom" variants of basic Keras containers (e.g., `Sequential`, `Layer`, ...).
Thanks for the comment, I can understand this; relying too much on hacking/modifying Keras internals has the downside that our code might become more fragile with respect to changes in Keras. I think the ability to pass non-basic types to our inference and summary network base classes would be nice to have; the behavior in this PR is one example of this.
I would be on board if it was a lightweight wrapper, but it seems that there is a lot of copied code from Keras, which we should avoid in general, imo. @vpratz Can we solve this through monkey-patching somehow, like with the …?
If this is not breaking any downstream task and only adds to the existing functionality, we may consider doing a PR to Keras, or asking them to implement it (or why they decided not to)?
@LarsKue Thanks for taking a look. I'll look into alternative ways to achieve this behavior.
@LarsKue Thanks a lot, this was a really good pointer. I have now implemented a wrapper around the constructor that is applied inside the … There is still some copied code, but I think it is now limited to the part that is required for the feature to behave as similarly to the Keras implementation as possible, apart from the desired changes (we might not need the part regarding the …). With those changes, we could remove the …

@stefanradev93 @LarsKue Please take another look and let me know what you think (the most relevant part is in …).
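The comment above is truncated, so here is only a rough sketch of the general technique it describes, wrapping subclass constructors from a base class; the actual hook and details in the PR may differ:

```python
import functools


class AutoConfigBase:
    """Toy base class: wraps every subclass constructor so that the
    constructor arguments are captured for later serialization."""

    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        original_init = cls.__init__

        @functools.wraps(original_init)
        def wrapped_init(self, *args, **init_kwargs):
            # record the constructor arguments before running the real __init__
            self._captured_init_kwargs = dict(init_kwargs)
            original_init(self, *args, **init_kwargs)

        cls.__init__ = wrapped_init


class Network(AutoConfigBase):
    def __init__(self, units=32, metrics=None):
        self.units = units
        self.metrics = metrics


net = Network(units=64)
print(net._captured_init_kwargs)  # {'units': 64}
```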
Thanks, this looks much better. Since this is a sensitive change, I think we should extensively test it before rolling it out. Otherwise, green light from my side!
This is what most classifier metrics expect, and contains more detail for the metrics to work with.
I forgot to make the same changes in the …
@paul-buerkner I would like to get a review from @stefanradev93 before we merge this, as the changes affect multiple parts of the code. Also, I would wait until we have decided how we proceed on #525...
I see some weird behavior due to the inheritance; I will have to give this another look...
- `get_config` has to be manually specified in the base classes, so that the config is stored even when a subclass overrides `get_config`
- to preserve the auto-config behavior, we have to use the `python_utils.default` decorator from Keras, which marks methods as default methods. This allows detecting whether a subclass has overridden them. This is the same mechanism that Keras uses
- moved setting the `custom_metrics` parameter after the `super().__init__()` calls, as the tracking is managed in `__setattr__` (see the sketch after this list)
- extended some tests to use metrics
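The third point can be illustrated with a short sketch (the class name is made up): Keras layers track sublayers, variables, and metric attributes through `__setattr__`, so the base constructor must run before any such attribute is assigned.

```python
import keras


class ExampleNetwork(keras.Layer):
    def __init__(self, custom_metrics=None, **kwargs):
        # super().__init__() must run first: Keras layers intercept attribute
        # assignment in __setattr__ to track sublayers, variables, and metrics,
        # and that machinery only exists after the base constructor has run.
        super().__init__(**kwargs)
        self.custom_metrics = custom_metrics or []
```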
Ok, I think it works. The general mechanism for getting the config of a subclass is:

a) The subclass overrides `get_config`: …

b) The subclass does not override `get_config`: …
The helpers Keras uses for this look like the following:

```python
def default(method):
    """Decorates a method to detect overrides in subclasses."""
    method._is_default = True
    return method


def is_default(method):
    """Check if a method is decorated with the `default` wrapper."""
    return getattr(method, "_is_default", False)
```

To adapt this setup for us, we want our base classes (`InferenceNetwork` and `SummaryNetwork`) to … The easiest way to achieve this is a …
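A self-contained sketch of how this detection behaves, repeating the two helpers from above (`BaseNetwork` and `CustomNetwork` are made-up names): a base class marks its own `get_config` as a default implementation, and any subclass that overrides the method loses the marker.

```python
def default(method):
    """Decorates a method to detect overrides in subclasses."""
    method._is_default = True
    return method


def is_default(method):
    """Check if a method is decorated with the `default` wrapper."""
    return getattr(method, "_is_default", False)


class BaseNetwork:
    @default
    def get_config(self):
        return {"base": True}


class CustomNetwork(BaseNetwork):
    def get_config(self):  # override: no longer marked as default
        return {**super().get_config(), "custom": True}


print(is_default(BaseNetwork.get_config))    # True
print(is_default(CustomNetwork.get_config))  # False
```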
@stefanradev93 @LarsKue I have fixed the problem I found (see above) and cleaned up the code a bit. Could you please take another look?
Calls to `layer_kwargs` are now only used in classes that directly inherit from `keras.Layer`, and have been moved into `InferenceNetwork` and `SummaryNetwork`. Fixes #497.
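An illustrative stand-in for this pattern. The real `layer_kwargs` helper lives in the codebase; the version below is only a sketch of the idea:

```python
import inspect

import keras


def layer_kwargs(kwargs: dict) -> dict:
    """Illustrative stand-in: keep only the kwargs that keras.Layer.__init__
    accepts (e.g. name, trainable, dtype), dropping network-specific ones."""
    valid = set(inspect.signature(keras.Layer.__init__).parameters) - {"self", "kwargs"}
    return {key: value for key, value in kwargs.items() if key in valid}


class SummaryNetwork(keras.Layer):
    def __init__(self, metrics=None, **kwargs):
        # the base class forwards only valid keras.Layer arguments, so
        # subclasses that pass through **kwargs no longer need to filter
        super().__init__(**layer_kwargs(kwargs))
        self.custom_metrics = metrics or []
```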