Commit 58fb2b8

Improve BatchNorm documentation

1 parent 063dd7d

1 file changed: 10 additions, 2 deletions

keras/layers/normalization.py
@@ -4,7 +4,9 @@


 class BatchNormalization(Layer):
-    '''Normalize the activations of the previous layer at each batch.
+    '''Normalize the activations of the previous layer at each batch,
+    i.e. applies a transformation that maintains the mean activation
+    close to 0. and the activation standard deviation close to 1.

     # Input shape
         Arbitrary. Use the keyword argument `input_shape`
@@ -18,7 +20,13 @@ class BatchNormalization(Layer):
         epsilon: small float > 0. Fuzz parameter.
         mode: integer, 0 or 1.
             - 0: feature-wise normalization.
-            - 1: sample-wise normalization.
+                If the input has multiple feature dimensions,
+                each will be normalized separately
+                (e.g. for an image input with shape
+                `(channels, rows, cols)`,
+                each combination of a channel, row and column
+                will be normalized separately).
+            - 1: sample-wise normalization. This mode assumes a 2D input.
         momentum: momentum in the computation of the
             exponential average of the mean and standard deviation
             of the data, for feature-wise normalization.
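For intuition, the two modes described in the updated docstring can be sketched in NumPy. This is a minimal illustration of what "feature-wise" vs. "sample-wise" normalization means (it omits the learned scale/shift parameters and the running-average momentum that the actual layer uses; the function names here are hypothetical):

```python
import numpy as np

def feature_wise_norm(X, epsilon=1e-6):
    # Mode 0: normalize over the batch axis (axis 0), so each feature
    # position is standardized across the batch independently. For an
    # image batch of shape (batch, channels, rows, cols), each
    # (channel, row, col) combination gets its own mean and std.
    mean = X.mean(axis=0)
    std = X.std(axis=0)
    return (X - mean) / (std + epsilon)

def sample_wise_norm(X, epsilon=1e-6):
    # Mode 1: normalize each sample over its own feature axis.
    # Assumes a 2D input of shape (batch, features).
    mean = X.mean(axis=1, keepdims=True)
    std = X.std(axis=1, keepdims=True)
    return (X - mean) / (std + epsilon)
```

After `feature_wise_norm`, each column of a 2D batch has mean close to 0 and standard deviation close to 1; after `sample_wise_norm`, each row does.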
