Fix docstrings
darsnack authored Jan 27, 2022
1 parent 15ac6cd commit f922c16
Showing 1 changed file with 2 additions and 2 deletions.
4 changes: 2 additions & 2 deletions src/layers/normalise.jl
@@ -55,7 +55,7 @@ function _dropout_mask(rng, x, p; dims=:)
end

"""
-    Dropout(p; dims=:, rng = default_rng())
+    Dropout(p; dims=:, rng = rng_from_array())
Dropout layer. In the forward pass, apply the [`Flux.dropout`](@ref) function on the input.
@@ -100,7 +100,7 @@ function Base.show(io::IO, d::Dropout)
end

"""
-    AlphaDropout(p; rng = default_rng())
+    AlphaDropout(p; rng = rng_from_array())
A dropout layer. Used in
[Self-Normalizing Neural Networks](https://arxiv.org/abs/1706.02515).
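For reference, a minimal usage sketch of the corrected signatures; the seed value, input shape, and `trainmode!` calls are illustrative assumptions, not part of this commit:

```julia
using Flux, Random

# Construct the layers with an explicit RNG, matching the corrected
# docstring signatures above.
d = Dropout(0.5; dims=:, rng=MersenneTwister(1))
a = AlphaDropout(0.2; rng=MersenneTwister(1))

# Dropout layers only drop activations while in training mode.
Flux.trainmode!(d)
Flux.trainmode!(a)

x = randn(Float32, 4, 8)
y = d(x)  # forward pass draws its dropout mask from the stored rng
z = a(x)
```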
