
Style sigmoid function in harmony with PEP guidelines (TheAlgorithms#6677)

* Style sigmoid function in harmony with PEP guidelines

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* Apply suggestions from code review

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: Tianyi Zheng <tianyizheng02@gmail.com>
3 people authored Sep 4, 2023
1 parent 421ace8 commit 5a4ea23
Showing 1 changed file with 2 additions and 2 deletions.
4 changes: 2 additions & 2 deletions neural_network/back_propagation_neural_network.py
@@ -21,8 +21,8 @@
from matplotlib import pyplot as plt


-def sigmoid(x):
-    return 1 / (1 + np.exp(-1 * x))
+def sigmoid(x: np.ndarray) -> np.ndarray:
+    return 1 / (1 + np.exp(-x))


class DenseLayer:
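For context, a minimal runnable sketch of the updated function with an illustrative usage line (the numpy import and the sample array are assumptions added for the example, not part of the diff):

import numpy as np


def sigmoid(x: np.ndarray) -> np.ndarray:
    # Maps each real input to the open interval (0, 1).
    return 1 / (1 + np.exp(-x))


# Illustrative usage: the type annotation documents that NumPy arrays are
# expected, and np.exp(-x) replaces the equivalent but noisier np.exp(-1 * x).
print(sigmoid(np.array([-2.0, 0.0, 2.0])))  # approx. [0.1192, 0.5, 0.8808]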
