Added Leaky ReLU activation function (1D & 3D) #123
Conversation
Thanks @Spnetic-5! A few things:
…
@milancurcic Please review; I have made the suggested changes.
Thank you, we're almost there. See my third comment above:
The build currently fails because of neural-fortran/src/nf/nf_activation_1d.f90, lines 21 to 26 in 7039276.
Looks good. Re: …
Thanks, Jeremie, I think that's a good idea. Having an activation_params derived type sounds like the way to go.
Where and how should I define the derived type? Could you please give me an example or the steps to implement this approach?
As I'm still not familiar with the entire Fortran codebase and how everything fits together, I tried implementing the optional alpha argument.
src/nf/nf_activation_3d.f90 (outdated):

```fortran
pure function leaky_relu_prime(x, alpha) result(res)
  ! First derivative of the Leaky Rectified Linear Unit (Leaky ReLU) activation function.
  real, intent(in) :: x(:,:,:)
  real, intent(in) :: alpha
```
It should be `optional` everywhere. In this case, I would use something like the stdlib `optval` function.
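For illustration, a minimal sketch of how an optional alpha could be resolved with `optval` in the 1D case; the module name, the default slope of 0.3, and the `max`-based body are assumptions for the sketch, not the PR's actual code:

```fortran
module leaky_relu_optval_sketch
  ! Hypothetical sketch: resolve the optional alpha with stdlib's optval.
  use stdlib_optval, only: optval
  implicit none
contains

  pure function leaky_relu(x, alpha) result(res)
    ! Leaky ReLU: returns x for x > 0 and alpha * x otherwise
    ! (the max form below assumes 0 < alpha < 1).
    real, intent(in) :: x(:)
    real, intent(in), optional :: alpha
    real :: res(size(x))
    real :: a
    a = optval(alpha, 0.3)   ! assumed default slope
    res = max(a * x, x)
  end function leaky_relu

end module leaky_relu_optval_sketch
```

The same pattern would apply to the 3D variants and to the `*_prime` derivatives.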
I'd like to evaluate the performance impact of using `optval`, since activation functions are invoked many times and sit at the very bottom of the call stack. It would be a good opportunity to use stdlib as a dependency, and via fpm it will be easy. But we'd also need to add it as a CMake dependency, which will require a bit more CMake code here. I'm wary of making this PR any larger because it's @Spnetic-5's first PR.
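If it helps, the impact could be measured with a small stand-alone timing program like the sketch below. Everything in it (names, the 0.3 default, array sizes, repetition counts) is made up for illustration; it compares a hard-coded alpha against one resolved per call via `present`, which is essentially what `optval` does internally:

```fortran
program bench_leaky_relu
  ! Hypothetical micro-benchmark: fixed alpha vs. alpha resolved at call time.
  implicit none
  integer, parameter :: n = 100000, reps = 1000
  real :: x(n), y(n)
  integer :: t0, t1, rate, i

  call random_number(x)
  x = x - 0.5
  call system_clock(count_rate=rate)

  call system_clock(t0)
  do i = 1, reps
    y = leaky_relu_fixed(x)
  end do
  call system_clock(t1)
  print '(a, f8.4, a)', 'fixed alpha:    ', real(t1 - t0) / rate, ' s'

  call system_clock(t0)
  do i = 1, reps
    y = leaky_relu_optional(x)
  end do
  call system_clock(t1)
  print '(a, f8.4, a)', 'optional alpha: ', real(t1 - t0) / rate, ' s'

  ! Keep y live so the loops are not optimized away.
  print *, 'checksum:', sum(y)

contains

  pure function leaky_relu_fixed(x) result(res)
    real, intent(in) :: x(:)
    real :: res(size(x))
    res = max(0.3 * x, x)
  end function leaky_relu_fixed

  pure function leaky_relu_optional(x, alpha) result(res)
    real, intent(in) :: x(:)
    real, intent(in), optional :: alpha
    real :: res(size(x))
    real :: a
    a = 0.3
    if (present(alpha)) a = alpha
    res = max(a * x, x)
  end function leaky_relu_optional

end program bench_leaky_relu
```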
Once we do the `activation_params` derived type (in a separate PR), there's a technique that will allow us to not rely on `present` or `optval`.
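For the record, that technique could look roughly like the sketch below; the type and component names are only placeholders, not settled design:

```fortran
module activation_params_sketch
  ! Hypothetical sketch: activation parameters live in a derived type with
  ! default initialization, so the functions never need optional arguments.
  implicit none

  type :: activation_params
    real :: alpha = 0.3   ! assumed default Leaky ReLU slope
  end type activation_params

contains

  pure function leaky_relu(x, params) result(res)
    real, intent(in) :: x(:)
    type(activation_params), intent(in) :: params
    real :: res(size(x))
    res = max(params % alpha * x, x)
  end function leaky_relu

end module activation_params_sketch
```

A layer would store one such value, filled in at network construction time, and pass it down unconditionally, so the hot path has neither `present` nor `optval` checks.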
Good points! I agree with you.
I think it's fine that we do just an optional alpha for now.
@Spnetic-5 Let's just make …
What needs changing in nf_conv2d_layer_submodule.f90?
Thanks for clearing things up, @milancurcic & @jvdp1. Now I'm facing the following error while building: …
In neural-fortran/src/nf/nf_dense_layer.f90, lines 15 to 33 in 7039276, you can add …
After you do these steps, I believe it should work correctly.
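In case it helps, the general shape of mapping an activation name to the layer's procedure pointers is sketched below; every name in it is a placeholder rather than the contents of the actual file:

```fortran
module activation_selection_sketch
  ! Hypothetical sketch of mapping an activation name to a procedure
  ! pointer; no name here is taken from the actual repository.
  implicit none

  abstract interface
    pure function activation_1d(x) result(res)
      real, intent(in) :: x(:)
      real :: res(size(x))
    end function activation_1d
  end interface

  type :: dense_layer_sketch
    procedure(activation_1d), pointer, nopass :: activation => null()
  end type dense_layer_sketch

contains

  pure function relu(x) result(res)
    real, intent(in) :: x(:)
    real :: res(size(x))
    res = max(0., x)
  end function relu

  pure function leaky_relu(x) result(res)
    real, intent(in) :: x(:)
    real :: res(size(x))
    res = max(0.3 * x, x)   ! assumed fixed slope of 0.3
  end function leaky_relu

  subroutine set_activation(self, name)
    ! Point the layer at the requested activation function.
    type(dense_layer_sketch), intent(in out) :: self
    character(*), intent(in) :: name
    select case (trim(name))
      case ('relu')
        self % activation => relu
      case ('leaky_relu')
        self % activation => leaky_relu
      case default
        error stop 'unsupported activation function'
    end select
  end subroutine set_activation

end module activation_selection_sketch
```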
To keep this PR small, let's implement the activation params approach in a separate PR. Thanks @Spnetic-5!
@milancurcic