Add functional normalizations #707
Conversation
kaiidams commented on Aug 20, 2022
- torch.nn.functional.batch_norm
- torch.nn.functional.group_norm
- torch.nn.functional.instance_norm
- torch.nn.functional.layer_norm
- torch.nn.functional.local_response_norm
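As a sketch of what the functions in the list above compute, here is `torch.nn.functional.layer_norm` reproduced in plain Python (PyTorch itself is not required; the `weight`/`bias`/`eps` parameters mirror the PyTorch signature, but this is an illustrative approximation, not the library implementation): normalize the input to zero mean and unit variance, then apply an optional affine transform.

```python
import math

def layer_norm(x, weight=None, bias=None, eps=1e-5):
    """Plain-Python sketch of layer normalization over a flat list."""
    mean = sum(x) / len(x)
    # Biased (population) variance, as layer norm uses
    var = sum((v - mean) ** 2 for v in x) / len(x)
    y = [(v - mean) / math.sqrt(var + eps) for v in x]
    if weight is not None:  # optional elementwise scale
        y = [v * w for v, w in zip(y, weight)]
    if bias is not None:    # optional elementwise shift
        y = [v + b for v, b in zip(y, bias)]
    return y

print(layer_norm([1.0, 2.0, 3.0, 4.0]))
```

The other functions in the list differ mainly in which axes the mean and variance are taken over (per-channel across the batch for batch norm, per-group for group norm, per-sample per-channel for instance norm).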
Unittest
@kaiidams -- thank you for your prolific contributions!
I was looking for
No, I do not. Open a new issue to track it before you make a PR.
This PR added functional versions of normalization layers such as torch.nn.LayerNorm and BatchNorm. torch.nn.functional.normalize, by contrast, normalizes vectors to unit norm. Both are named "normalize", but they are different operations. I think you can implement them and make a PR if you need them.
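The distinction drawn in the comment above can be shown side by side in plain Python (a sketch, not the PyTorch implementation): `torch.nn.functional.normalize` rescales a vector to unit L2 norm without centering it, whereas the normalization layers in this PR standardize to zero mean and unit variance.

```python
import math

def l2_normalize(x, eps=1e-12):
    # What torch.nn.functional.normalize does (p=2): scale the
    # vector to unit length; no mean subtraction.
    norm = max(math.sqrt(sum(v * v for v in x)), eps)
    return [v / norm for v in x]

def standardize(x, eps=1e-5):
    # What the normalization layers (LayerNorm etc.) do:
    # subtract the mean, divide by the standard deviation.
    mean = sum(x) / len(x)
    var = sum((v - mean) ** 2 for v in x) / len(x)
    return [(v - mean) / math.sqrt(var + eps) for v in x]

x = [3.0, 4.0]
print(l2_normalize(x))  # unit length: [0.6, 0.8]
print(standardize(x))   # zero mean: roughly [-1.0, 1.0]
```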
Thanks @NiklasGustafsson and @kaiidams for your answers!