Closed
Dear TFP team,
I am trying to use the `DenseVariational` layer inside Keras's `TimeDistributed` layer, but I get the following error:
```
<ipython-input-228-ecfae02fb877> in <module>
14 conv_model = layers.TimeDistributed(layers.Flatten())(conv_model)
15 conv_model = layers.TimeDistributed(layers.Dropout(rate=0.1))(conv_model)
---> 16 conv_model = layers.TimeDistributed(tfp.layers.DenseVariational(64,make_posterior_fn=posterior,make_prior_fn=prior, activation='relu'))(conv_model)
17 conv_model = layers.TimeDistributed(tfp.layers.DenseFlipout(1, activation='relu',kernel_divergence_fn=kl_divergence_function, name='Output'))(conv_model)
18

~/anaconda3/envs/tfp/lib/python3.8/site-packages/keras/utils/traceback_utils.py in error_handler(*args, **kwargs)
65 except Exception as e: # pylint: disable=broad-except
66 filtered_tb = _process_traceback_frames(e.__traceback__)
---> 67 raise e.with_traceback(filtered_tb) from None
68 finally:
69 del filtered_tb

~/anaconda3/envs/tfp/lib/python3.8/site-packages/keras/engine/base_layer.py in compute_output_shape(self, input_shape)
826 self.__class__.__name__) from e
827 return tf.nest.map_structure(lambda t: t.shape, outputs)
--> 828 raise NotImplementedError(
829 'Please run in eager mode or implement the `compute_output_shape` '
830 'method on your layer (%s).' % self.__class__.__name__)

NotImplementedError: Exception encountered when calling layer "time_distributed_476" (type TimeDistributed).
Please run in eager mode or implement the `compute_output_shape` method on your layer (DenseVariational).

Call arguments received:
• inputs=tf.Tensor(shape=(None, 5, 384), dtype=float32)
• training=None
• mask=None
```
I checked, and both the `DenseVariational` and `TimeDistributed` layers appear to have `compute_output_shape` implemented already. Can anyone please give me some leads?
I am out of ideas about what to try next. I do not get this error when I use the `DenseFlipout` layer.
Thanks in advance.