From f7bbd5a2c48958c429b5d043022625a896da24bb Mon Sep 17 00:00:00 2001
From: Kyle Daruwalla
Date: Sat, 30 Jan 2021 16:45:57 -0600
Subject: [PATCH 1/4] Make N-dimensional dropout docstring clearer.

---
 src/layers/normalise.jl | 3 +++
 1 file changed, 3 insertions(+)

diff --git a/src/layers/normalise.jl b/src/layers/normalise.jl
index 5f9e116f29..4fb842c29c 100644
--- a/src/layers/normalise.jl
+++ b/src/layers/normalise.jl
@@ -51,6 +51,9 @@ end
 
 Dropout layer. In the forward pass, apply the [`Flux.dropout`](@ref) function on the input.
 
+For N-D dropout layers (e.g. `Dropout2d` or `Dropout3d` in PyTorch),
+specify the `dims` keyword (i.e. `Dropout(p; dims = 2)` is a 2D dropout layer).
+
 Does nothing to the input once [`Flux.testmode!`](@ref) is `true`.
 """
 mutable struct Dropout{F,D}

From ad836662e86ce0c20805598b5e4bbbe0a2f3c631 Mon Sep 17 00:00:00 2001
From: Kyle Daruwalla
Date: Sat, 30 Jan 2021 16:49:42 -0600
Subject: [PATCH 2/4] Fix typo

---
 src/layers/normalise.jl | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/src/layers/normalise.jl b/src/layers/normalise.jl
index 4fb842c29c..882c03a53f 100644
--- a/src/layers/normalise.jl
+++ b/src/layers/normalise.jl
@@ -52,7 +52,7 @@ end
 Dropout layer. In the forward pass, apply the [`Flux.dropout`](@ref) function on the input.
 
 For N-D dropout layers (e.g. `Dropout2d` or `Dropout3d` in PyTorch),
-specify the `dims` keyword (i.e. `Dropout(p; dims = 2)` is a 2D dropout layer).
+specify the `dims` keyword (i.e. `Dropout(p; dims = 3)` is a 2D dropout layer).
 
 Does nothing to the input once [`Flux.testmode!`](@ref) is `true`.
 """

From 94be20fc0a853905a2b49845dce96653c0a7cfab Mon Sep 17 00:00:00 2001
From: Kyle Daruwalla
Date: Sat, 30 Jan 2021 16:56:34 -0600
Subject: [PATCH 3/4] More informative explanation from review

Co-authored-by: Brian Chen
---
 src/layers/normalise.jl | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/src/layers/normalise.jl b/src/layers/normalise.jl
index 882c03a53f..29391fbf29 100644
--- a/src/layers/normalise.jl
+++ b/src/layers/normalise.jl
@@ -51,8 +51,8 @@ end
 
 Dropout layer. In the forward pass, apply the [`Flux.dropout`](@ref) function on the input.
 
-For N-D dropout layers (e.g. `Dropout2d` or `Dropout3d` in PyTorch),
-specify the `dims` keyword (i.e. `Dropout(p; dims = 3)` is a 2D dropout layer).
+To apply dropout along an certain dimension (e.g. zeroing out an entire channel's feature map),
+specify the `dims` keyword (i.e. `Dropout(p; dims = 3)` is a 2D dropout layer on WHCN input).
 
 Does nothing to the input once [`Flux.testmode!`](@ref) is `true`.
 """
@@ -423,4 +423,4 @@ function Base.show(io::IO, l::GroupNorm)
   print(io, "GroupNorm($(join(size(l.β), ", "))")
   (l.λ == identity) || print(io, ", λ = $(l.λ)")
   print(io, ")")
-end
\ No newline at end of file
+end

From 5cac6f6f9322ac946a2d1c9d958cf57a46927127 Mon Sep 17 00:00:00 2001
From: Kyle Daruwalla
Date: Sat, 30 Jan 2021 17:03:09 -0600
Subject: [PATCH 4/4] Remove more wordiness

---
 src/layers/normalise.jl | 5 +++--
 1 file changed, 3 insertions(+), 2 deletions(-)

diff --git a/src/layers/normalise.jl b/src/layers/normalise.jl
index 29391fbf29..51df2e02fc 100644
--- a/src/layers/normalise.jl
+++ b/src/layers/normalise.jl
@@ -51,8 +51,9 @@ end
 
 Dropout layer. In the forward pass, apply the [`Flux.dropout`](@ref) function on the input.
 
-To apply dropout along an certain dimension (e.g. zeroing out an entire channel's feature map),
-specify the `dims` keyword (i.e. `Dropout(p; dims = 3)` is a 2D dropout layer on WHCN input).
+To apply dropout along certain dimension(s), specify the `dims` keyword.
+e.g. `Dropout(p; dims = 3)` will randomly zero out entire channels on WHCN input
+(also called 2D dropout).
 
 Does nothing to the input once [`Flux.testmode!`](@ref) is `true`.
 """
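The `dims` semantics the final docstring describes can be sketched without Flux. The following is a minimal, hypothetical re-implementation (`dropout_sketch` is not a Flux function): the Bernoulli mask has size 1 along every dimension *not* listed in `dims`, so on WHCN input with `dims = 3` the mask varies only along the channel dimension and each 4×4 feature map is kept or zeroed as a unit.

```julia
using Random

# Hypothetical sketch of the masking semantics above — not Flux's implementation.
# The mask has size 1 along every dimension NOT in `dims`, so those dimensions
# share a single keep/drop decision per slice.
function dropout_sketch(rng, x::AbstractArray, p; dims)
    mask_size = ntuple(d -> d in dims ? size(x, d) : 1, ndims(x))
    keep = rand(rng, Float32, mask_size) .> p   # Bernoulli(1 - p), one draw per slice
    return x .* keep ./ (1 - p)                 # zero dropped slices, rescale survivors
end

x = ones(Float32, 4, 4, 3, 2)  # WHCN: 4×4 feature maps, 3 channels, batch of 2
y = dropout_sketch(MersenneTwister(0), x, 0.5; dims = 3)
# With dims = 3 each 4×4 feature map is uniformly 0 (dropped) or
# 2.0 (kept and rescaled by 1/(1-p)).
```

Note that with this formulation the per-channel decision is also shared across the batch, since the batch dimension is not in `dims`; PyTorch's `Dropout2d`, by contrast, draws an independent channel mask per sample.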