Preserve the type in differentiation #149
Conversation
Codecov Report

              master     #149    +/-
  Coverage    74.86%    74.86%
  Files           24        24
  Lines          768       768
  Hits           575       575
  Misses         193       193

Continue to review full report at Codecov.
Thanks @matsueushi!
Should we also add the testset to our regular tests?
Do I need to add the test to test/activation.jl? Currently NNlib.jl doesn't have a Zygote.jl dependency. Or would it be better to update the tests in Zygote.jl?
I would add a type-stability test to NNlib (just ensure that the activation functions don't change the type unnecessarily).
If you are talking about the values of the activation functions, a test is already defined in test/activation.jl (lines 5 to 14 at 1c35815) and the previous definitions passed it. I modified it to test gradients.
Right, so what I mean is that we should have that same test for the gradients of the activation functions. :)
Thanks, I see what you mean. |
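For reference, a minimal sketch of what such a gradient testset in test/activation.jl could look like (illustrative only; it assumes Zygote as a test-time dependency, which, as noted above, NNlib does not currently have):

```julia
using Test
using Zygote          # assumed test-only dependency for this sketch
using NNlib: leakyrelu, elu, selu

@testset "gradients preserve the input float type" begin
    for f in (leakyrelu, elu, selu), T in (Float32, Float64), x in (T(0.5), -T(0.5))
        g, = Zygote.gradient(f, x)
        # The gradient should have the same floating-point type as the input.
        @test g isa T
    end
end
```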
The current implementations of leakyrelu, elu and selu return Float64 gradients for Float32 inputs, cf. FluxML/Flux.jl#963. This PR is intended to preserve float precision for differentiation.
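As background on the mechanism (this toy snippet is not the diff in this PR, and the function names are made up for illustration): a bare 0.01 literal is a Float64, so any arithmetic mixing it with Float32 values promotes to Float64, whereas converting the constant to the input's type keeps the precision:

```julia
# A Float64 literal promotes: 0.01 * x is a Float64 when x is a Float32.
leakyrelu_promoting(x) = max(0.01 * x, x)

# Converting the constant to the input's type keeps the precision.
# (x / 1 turns integer inputs into floats before the conversion.)
leakyrelu_preserving(x) = max(oftype(x / 1, 0.01) * x, x / 1)

leakyrelu_promoting(-0.5f0)   # -0.005   (a Float64)
leakyrelu_preserving(-0.5f0)  # -0.005f0 (a Float32)
```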
Before
After
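The original before/after output is not reproduced here; an illustrative way to check the gradient types (assuming Zygote) is:

```julia
using NNlib, Zygote

# Differentiate leakyrelu at a Float32 input and inspect the gradient's type.
g, = Zygote.gradient(leakyrelu, -0.5f0)
typeof(g)
# Before this PR: Float64
# After this PR:  Float32
```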