System information
Describe the bug
I am using the WeightNormalization wrapper in a Keras model and I want to monitor the weights and gradients using TensorBoard, but I believe the use of this boolean parameter (https://github.com/tensorflow/addons/blob/master/tensorflow_addons/layers/wrappers.py#L104), which checks whether the layer has been initialised, is causing the TensorBoard histogram generation to fail. I changed it to a tf.dtypes.int32 variable with shape (1,), which solved the issue, but this seems a bit hacky.
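For context, the change described above amounts to something like the following edit inside `WeightNormalization.build()` in `wrappers.py`. This is a sketch of the reported workaround, not a verified patch; the `add_weight()` call is assumed to mirror the linked line:

```python
# Inside WeightNormalization.build() in tensorflow_addons/layers/wrappers.py.
# The linked line creates the "initialized" flag as a scalar boolean weight;
# the workaround swaps it for an int32 variable of shape (1,) so the
# TensorBoard histogram callback can summarize it like any other weight:
self._initialized = self.add_weight(
    name="initialized",
    shape=(1,),             # was shape=None
    initializer="zeros",
    dtype=tf.dtypes.int32,  # was tf.dtypes.bool
    trainable=False,
)
```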
Code to reproduce the issue

I am using the model.fit_generator() function on a model which includes the WeightNormalization wrapper, and it trains fine without the TensorBoard callback. A minimal sketch of this kind of setup follows.
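The original snippet is not shown, so this is only an illustrative sketch matching the description above; the layer sizes, toy data generator, and log directory are placeholders:

```python
import numpy as np
import tensorflow as tf
import tensorflow_addons as tfa

# A small model wrapping a Dense layer in WeightNormalization.
model = tf.keras.Sequential([
    tfa.layers.WeightNormalization(
        tf.keras.layers.Dense(16, activation="relu"), input_shape=(8,)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

# A toy generator standing in for the real data pipeline.
def gen():
    while True:
        x = np.random.rand(32, 8).astype("float32")
        yield x, np.sum(x, axis=1, keepdims=True)

# histogram_freq=1 makes the callback write weight histograms each epoch,
# which is the step that reportedly fails with the boolean flag.
# write_grads is accepted but ignored in TF 2.x, matching the note below.
tb = tf.keras.callbacks.TensorBoard(log_dir="/tmp/wn_tb",
                                    histogram_freq=1, write_grads=True)

# fit_generator is deprecated in newer TF releases; model.fit with the
# same generator behaves equivalently.
model.fit_generator(gen(), steps_per_epoch=10, epochs=2, callbacks=[tb])
```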
(I don't think the write_grads option is doing anything, but that's a separate issue.) Here is the actual error message:
The current implementation of WeightNormalization is very hacky. The biggest problem is that we replace the layer's kernel attribute with a tensor in the graph, so if some tool expects the inner layer to have a kernel that is a tf.Variable, it crashes. I tried to fix this in #1789, but I don't think the variable g is going to get updated with that fix.
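A quick way to observe this, assuming the replacement happens when the wrapper is first called (as described above):

```python
import tensorflow as tf
import tensorflow_addons as tfa

# After the wrapper has run once, the inner layer's `kernel` attribute
# has been overwritten with a computed tensor (the normalized weights),
# not a tf.Variable, which is what breaks variable-based tooling.
wrapped = tfa.layers.WeightNormalization(tf.keras.layers.Dense(4))
_ = wrapped(tf.zeros((1, 8)))  # trigger the wrapper's call()

print(isinstance(wrapped.layer.kernel, tf.Variable))  # expected: False
print(type(wrapped.layer.kernel))                     # a plain Tensor
```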