
WeightNormalization not working with TensorBoard histograms #1801

Open
TWJubb opened this issue May 8, 2020 · 1 comment
Labels
bug (Something isn't working), layers

Comments


TWJubb commented May 8, 2020

System information

  • TensorFlow version 2.1.0
  • TensorFlow-Addons version 0.9.1
  • Python version 3.6
  • Is GPU used? yes

Describe the bug

I am using the WeightNormalization wrapper in a Keras model and I want to monitor the weights and gradients using TensorBoard, but I believe this boolean parameter

https://github.com/tensorflow/addons/blob/master/tensorflow_addons/layers/wrappers.py#L104

which tracks whether the layer has been initialised, is causing the TensorBoard histogram generation to fail. I changed it to a tf.int32 variable with shape (1,), which solved the issue, but that feels hacky.
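
A minimal sketch of what I think is going on (the variable name and log directory here are illustrative): writing a histogram summary for a boolean variable fails in exactly this way, because WriteHistogramSummary does not accept bool tensors.

```python
import tensorflow as tf

# The wrapper tracks initialisation state in a boolean, non-trainable
# variable, roughly like this:
initialized = tf.Variable(False, trainable=False, name="initialized")

writer = tf.summary.create_file_writer("/tmp/tb_repro")  # illustrative path
with writer.as_default():
    # Raises InvalidArgumentError: bool is not in WriteHistogramSummary's
    # allowed types (float, double, int32, ...).
    tf.summary.histogram("initialized", initialized, step=0)
```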

Code to reproduce the issue

I am using the model.fit_generator() function with

callbacks = [tf.keras.callbacks.TensorBoard(write_grads=True, histogram_freq=1, log_dir="...")]

on a model which includes WeightNormalization; the model trains fine without the TensorBoard callback.

(I don't think write_grads is doing anything, but that's a separate issue.)
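
For reference, a self-contained sketch along these lines should hit the same error (the toy model, data, and log directory are placeholders; the real model is larger, and I use model.fit here rather than fit_generator for brevity):

```python
import numpy as np
import tensorflow as tf
import tensorflow_addons as tfa

# Placeholder toy model; any model containing WeightNormalization will do.
model = tf.keras.Sequential([
    tfa.layers.WeightNormalization(
        tf.keras.layers.Dense(8, activation="relu"), input_shape=(4,)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

x = np.random.rand(64, 4).astype("float32")
y = np.random.rand(64, 1).astype("float32")

# histogram_freq=1 makes the callback write a histogram for every weight,
# including the wrapper's boolean `initialized` flag.
tensorboard = tf.keras.callbacks.TensorBoard(log_dir="/tmp/tb_repro",
                                             histogram_freq=1)

model.fit(x, y, epochs=1, callbacks=[tensorboard])  # fails writing histograms
```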

Here is the actual error message:

InvalidArgumentError: Value for attr 'T' of bool is not in the list of allowed values: float, double, int32, uint8, int16, int8, int64, bfloat16, uint16, half, uint32, uint64
	; NodeDef: {{node WriteHistogramSummary}}; Op<name=WriteHistogramSummary; signature=writer:resource, step:int64, tag:string, values:T -> ; attr=T:type,default=DT_FLOAT,allowed=[DT_FLOAT, DT_DOUBLE, DT_INT32, DT_UINT8, DT_INT16, DT_INT8, DT_INT64, DT_BFLOAT16, DT_UINT16, DT_HALF, DT_UINT32, DT_UINT64]; is_stateful=true> [Op:WriteHistogramSummary] name: causal_conv2d_3/weight_normalization_42/initialized_0/
gabrieldemarmiesse (Member) commented
The current implementation of WeightNormalization is very hacky. The biggest problem is that we replace the layer's kernel attribute with a tensor in the graph. So if some tool expects the inner layer to have a kernel tf.Variable, it crashes. I tried to fix it in #1789, but I don't think the variable g is going to get updated with that fix.
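
To make that concrete, a simplified sketch of the pattern (toy layer and shapes, not the actual tfa source):

```python
import tensorflow as tf

dense = tf.keras.layers.Dense(8)
dense.build((None, 4))

g = tf.Variable(1.0)  # weight-norm magnitude
v = dense.kernel      # the original kernel tf.Variable (the direction)

# Weight normalization re-parameterizes the kernel as w = g * v / ||v||.
# After this assignment, the layer's `kernel` attribute is a computed
# tf.Tensor, not a tf.Variable, which breaks tools that expect a variable:
dense.kernel = g * tf.nn.l2_normalize(v)

print(isinstance(dense.kernel, tf.Variable))  # False
```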

seanpmorgan added the bug (Something isn't working) and layers labels on Jun 3, 2020