Thresholded Linear Unit #857
Conversation
@AakashKumarNain Any reason to put this under …
@Squadrick We have two learnable parameters for this activation, similar to …
@AakashKumarNain Yeah, you're right. I'd forgotten that …
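For reference, here is a minimal sketch of what such a layer could look like. It follows the TLU definition from the Filter Response Normalization paper, z = max(y, tau), with a per-channel learnable tau and an assumed affine variant max(y, alpha * y + tau); the names (`TLU`, `affine`, `alpha_initializer`, `tau_initializer`) mirror those used in the test later in this thread, but the exact implementation in this PR may differ.

```python
import tensorflow as tf


class TLU(tf.keras.layers.Layer):
    """Thresholded Linear Unit: max(x, tau), or max(x, alpha * x + tau) when affine.

    Illustrative sketch only; not necessarily the implementation in this PR.
    """

    def __init__(self, affine=False, tau_initializer='zeros',
                 alpha_initializer='zeros', **kwargs):
        super().__init__(**kwargs)
        self.affine = affine
        self.tau_initializer = tf.keras.initializers.get(tau_initializer)
        self.alpha_initializer = tf.keras.initializers.get(alpha_initializer)

    def build(self, input_shape):
        # One learnable threshold (and optional slope) per channel.
        param_shape = (input_shape[-1],)
        self.tau = self.add_weight(
            name='tau', shape=param_shape, initializer=self.tau_initializer)
        if self.affine:
            self.alpha = self.add_weight(
                name='alpha', shape=param_shape, initializer=self.alpha_initializer)
        super().build(input_shape)

    def call(self, inputs):
        if self.affine:
            return tf.maximum(inputs, self.alpha * inputs + self.tau)
        return tf.maximum(inputs, self.tau)
```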
@AakashKumarNain Also test for serialization.
Can you elaborate on this? Any example?
Will this suffice?

```python
def test_serialization(self):
    tlu = TLU(affine=True, alpha_initializer='ones', tau_initializer='ones')
    serialized_tlu = tf.keras.layers.serialize(tlu)
    new_layer = tf.keras.layers.deserialize(serialized_tlu, custom_objects={'TLU': TLU})
    self.assertEqual(tlu.get_config(), new_layer.get_config())
```
@AakashKumarNain Yeah, that works. It'll work without the …
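For completeness, the serialize/deserialize round trip in that test works as long as the layer exposes its constructor arguments through `get_config`. Continuing the illustrative sketch above (not necessarily the code in this PR), that method could look like:

```python
    def get_config(self):
        # Expose constructor arguments so the layer can be rebuilt from its config.
        config = {
            'affine': self.affine,
            'tau_initializer': tf.keras.initializers.serialize(self.tau_initializer),
            'alpha_initializer': tf.keras.initializers.serialize(self.alpha_initializer),
        }
        return {**super().get_config(), **config}
```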
@seanpmorgan @Squadrick I think we can merge it now.
LGTM
Thanks
I forgot to update the README. Will do it later.
This activation unit was introduced in the Filter Response Normalization paper.
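In that paper, the TLU is applied to the output of the Filter Response Normalization step, z = max(y, tau). As a hypothetical usage example, reusing the illustrative `TLU` class sketched earlier in this thread (FRN itself is not part of the sketch):

```python
inputs = tf.keras.Input(shape=(32, 32, 3))
x = tf.keras.layers.Conv2D(16, 3, padding='same')(inputs)
# In the FRN paper the TLU follows the normalization step; here it simply
# follows a convolution to keep the example short.
outputs = TLU(affine=True)(x)
model = tf.keras.Model(inputs, outputs)
model.summary()
```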