Thresholded Linear Unit #857

Merged (11 commits) on Jan 13, 2020

Conversation

AakashKumarNain (Member)

This activation unit was introduced in the Filter Response Normalization paper.
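
For context, here is a minimal sketch of the forward pass described in the paper: f(x) = max(x, tau), or max(x, alpha * x + tau) when an affine variant is enabled. The parameter names and the affine option are assumptions based on the discussion below, not the exact implementation in this PR.

import tensorflow as tf

class TLUSketch(tf.keras.layers.Layer):
    """Illustrative Thresholded Linear Unit with a learnable per-channel threshold."""

    def __init__(self, affine=False, **kwargs):
        super().__init__(**kwargs)
        self.affine = affine

    def build(self, input_shape):
        param_shape = (input_shape[-1],)
        # Learnable threshold, one value per channel.
        self.tau = self.add_weight(name="tau", shape=param_shape, initializer="zeros")
        if self.affine:
            # Optional learnable slope applied below the threshold.
            self.alpha = self.add_weight(name="alpha", shape=param_shape, initializer="zeros")

    def call(self, inputs):
        if self.affine:
            return tf.maximum(inputs, self.alpha * inputs + self.tau)
        return tf.maximum(inputs, self.tau)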

@Squadrick (Member)

@AakashKumarNain Any reason to put this under layers as opposed to activations? Also, could you update the README.md?

@AakashKumarNain (Member, Author)

@Squadrick we have two learnable parameters for this activation, similar to PReLU. It makes more sense to build a layer for it; if we wanted to include it in the activations package, the activation function would first have to initialize an instance of this layer and then call it on the inputs. Let me know what you think.
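
To illustrate the point, a stateless activation wrapper would have to create the layer (and its weights) on every call, roughly as below, so tau and alpha could never be trained or shared. This is a sketch only; TLU here stands in for the layer added in this PR.

def tlu_activation(x):
    # Sketch only: each call builds a brand-new TLU layer, so its
    # learnable parameters are recreated every time and never trained.
    return TLU()(x)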

@Squadrick (Member)

@AakashKumarNain Yeah, you're right. I'd forgotten that tf.keras expects activations to be stateless. Let's not bother with creating anything under activations; there's precedent in core tf.keras, which implements PReLU as a layer.

@Squadrick (Member) left a comment

@AakashKumarNain Also test for serialization.

@AakashKumarNain (Member, Author)

@AakashKumarNain Also test for serialization.

Can you elaborate on this? Any example?

@Squadrick (Member)

@AakashKumarNain Also test for serialization.

Can you elaborate on this? Any example?

See this existing test for reference:

def test_serialization(self, base_layer, rnn):

@AakashKumarNain (Member, Author)

Will this suffice?

def test_serialization(self):
    tlu = TLU(affine=True, alpha_initializer='ones', tau_initializer='ones')
    serialized_tlu = tf.keras.layers.serialize(tlu)
    new_layer = tf.keras.layers.deserialize(serialized_tlu, custom_objects={'TLU': TLU})
    self.assertEqual(tlu.get_config(), new_layer.get_config())

@Squadrick (Member)

@AakashKumarNain Yeah, that works. It'll also work without the custom_objects argument.
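
A note on why it can work without custom_objects: deserialization can resolve the class by name only if it has been registered as a Keras serializable object. A minimal sketch of that pattern, assuming tf.keras.utils.register_keras_serializable is available in the TF version being used:

import tensorflow as tf

@tf.keras.utils.register_keras_serializable(package="Addons")
class TLU(tf.keras.layers.Layer):
    # Layer implementation as in this PR; once registered, the class can be
    # resolved by name during deserialization without passing custom_objects.
    ...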

@AakashKumarNain (Member, Author)

@seanpmorgan @Squadrick I think we can merge it now.

@Squadrick (Member) left a comment

LGTM

@Squadrick Squadrick merged commit 8b08fc4 into tensorflow:master Jan 13, 2020
@AakashKumarNain (Member, Author)

Thanks

@AakashKumarNain (Member, Author)

I forgot to update the README. Will do it later.
