Added A General Swish Activation Function in Neural Networks (TheAlgorithms#10415)

* Added A General Swish Activation Function in Neural Networks

* Added the general swish function to the file containing the SiLU function and renamed the file swish.py

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

---------

Co-authored-by: Shivansh Bhatnagar <shivansh.bhatnagar.mat22@iitbhu.ac.in>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
3 people authored Oct 18, 2023
1 parent 361f64c commit 572de4f
Showing 1 changed file with 20 additions and 0 deletions.
@@ -12,6 +12,7 @@
This script is inspired by a corresponding research paper.
* https://arxiv.org/abs/1710.05941
* https://blog.paperspace.com/swish-activation-function/
"""

import numpy as np
@@ -49,6 +50,25 @@ def sigmoid_linear_unit(vector: np.ndarray) -> np.ndarray:
    return vector * sigmoid(vector)


def swish(vector: np.ndarray, trainable_parameter: int) -> np.ndarray:
    """
    Parameters:
        vector (np.ndarray): A numpy array consisting of real values
        trainable_parameter (int): Used to implement various Swish activation functions
    Returns:
        swish_vec (np.ndarray): The input numpy array, after applying swish
    Examples:
    >>> swish(np.array([-1.0, 1.0, 2.0]), 2)
    array([-0.11920292,  0.88079708,  1.96402758])
    >>> swish(np.array([-2]), 1)
    array([-0.23840584])
    """
    return vector * sigmoid(trainable_parameter * vector)


if __name__ == "__main__":
    import doctest

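For context, the general Swish added in this commit computes x * sigmoid(beta * x), where beta is the trainable_parameter; with beta = 1 it reduces to the existing sigmoid_linear_unit (SiLU). A minimal standalone sketch (not part of the commit; the sigmoid helper is restated here so the snippet runs on its own):

import numpy as np


def sigmoid(vector: np.ndarray) -> np.ndarray:
    # Logistic sigmoid, matching the helper used elsewhere in the file.
    return 1 / (1 + np.exp(-vector))


def swish(vector: np.ndarray, trainable_parameter: int) -> np.ndarray:
    # General Swish from this commit: x * sigmoid(beta * x).
    return vector * sigmoid(trainable_parameter * vector)


v = np.array([-1.0, 1.0, 2.0])
# With beta = 1, Swish coincides with SiLU: x * sigmoid(x).
assert np.allclose(swish(v, 1), v * sigmoid(v))
print(swish(v, 2))  # [-0.11920292  0.88079708  1.96402758]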
