add activation on last logic #2924


Closed
wants to merge 1 commit into from

Conversation

billyang98

Summary:

Context

Add an `activation_on_last` flag for parity, making the MLP module more configurable.

`activation_on_last` is a flag that toggles whether the given (or default) activation function is applied to the last layer of the MLP. Typically, all layers of the MLP have the activation function applied. If the flag is set to false, the last layer will not have the activation function applied, so users can optionally consume the raw MLP output for their own customized needs.
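The layer-construction logic described above can be sketched as follows. This is a minimal illustration, not the actual TorchRec implementation; the function name `build_mlp_layers` and the descriptor-tuple representation are assumptions made for clarity.

```python
# Illustrative sketch (not the actual TorchRec code) of how an
# activation_on_last flag can control whether the final MLP layer
# receives the activation function.

def build_mlp_layers(layer_sizes, activation="relu", activation_on_last=True):
    """Return a list of (op, arg) tuples describing the MLP stack."""
    layers = []
    num_linear = len(layer_sizes) - 1
    for i in range(num_linear):
        layers.append(("linear", (layer_sizes[i], layer_sizes[i + 1])))
        is_last = i == num_linear - 1
        # Skip the activation on the final layer when the flag is False,
        # leaving the raw linear output available to the caller.
        if activation_on_last or not is_last:
            layers.append(("activation", activation))
    return layers
```

With the flag left at its default, every layer ends in an activation; with `activation_on_last=False`, the stack ends in the bare linear layer, e.g. `build_mlp_layers([8, 4, 2], activation_on_last=False)` ends with `("linear", (4, 2))`.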

Reviewed By: TroyGarden

Differential Revision: D73691616

@facebook-github-bot facebook-github-bot added the CLA Signed This label is managed by the Facebook bot. Authors need to sign the CLA before a PR can be reviewed. label Apr 29, 2025
@facebook-github-bot
Contributor

This pull request was exported from Phabricator. Differential Revision: D73691616
