
Commit a125d63

fix bn docs (#30096)
1 parent: 3342477

1 file changed: 6 additions, 6 deletions


python/paddle/nn/layer/norm.py

Lines changed: 6 additions & 6 deletions
@@ -652,7 +652,7 @@ class BatchNorm1D(_BatchNormBase):
     r"""
     Applies Batch Normalization over a 2D or 3D input (a mini-batch of 1D inputs with additional channel dimension) as described in the paper Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift .

-    When track_running_stats = False, the :math:`\\mu_{\\beta}`
+    When use_global_stats = False, the :math:`\\mu_{\\beta}`
     and :math:`\\sigma_{\\beta}^{2}` are the statistics of one mini-batch.
     Calculated as follows:

@@ -663,7 +663,7 @@ class BatchNorm1D(_BatchNormBase):
         \\sigma_{\\beta}^{2} &\\gets \\frac{1}{m} \\sum_{i=1}^{m}(x_i - \\
         \\mu_{\\beta})^2 \\qquad &//\ mini-batch\ variance \\\\

-    When track_running_stats = True, the :math:`\\mu_{\\beta}`
+    When use_global_stats = True, the :math:`\\mu_{\\beta}`
     and :math:`\\sigma_{\\beta}^{2}` are not the statistics of one mini-batch.
     They are global or running statistics (moving_mean and moving_variance). It usually got from the
     pre-trained model. Calculated as follows:
@@ -743,7 +743,7 @@ class BatchNorm2D(_BatchNormBase):
     r"""
     Applies Batch Normalization over a 4D input (a mini-batch of 2D inputs with additional channel dimension) as described in the paper Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift .

-    When track_running_stats = False, the :math:`\\mu_{\\beta}`
+    When use_global_stats = False, the :math:`\\mu_{\\beta}`
     and :math:`\\sigma_{\\beta}^{2}` are the statistics of one mini-batch.
     Calculated as follows:

@@ -754,7 +754,7 @@ class BatchNorm2D(_BatchNormBase):
         \\sigma_{\\beta}^{2} &\\gets \\frac{1}{m} \\sum_{i=1}^{m}(x_i - \\
         \\mu_{\\beta})^2 \\qquad &//\ mini-batch\ variance \\\\

-    When track_running_stats = True, the :math:`\\mu_{\\beta}`
+    When use_global_stats = True, the :math:`\\mu_{\\beta}`
     and :math:`\\sigma_{\\beta}^{2}` are not the statistics of one mini-batch.
     They are global or running statistics (moving_mean and moving_variance). It usually got from the
     pre-trained model. Calculated as follows:
@@ -832,7 +832,7 @@ class BatchNorm3D(_BatchNormBase):
     r"""
     Applies Batch Normalization over a 5D input (a mini-batch of 3D inputs with additional channel dimension) as described in the paper Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift .

-    When track_running_stats = False, the :math:`\\mu_{\\beta}`
+    When use_global_stats = False, the :math:`\\mu_{\\beta}`
     and :math:`\\sigma_{\\beta}^{2}` are the statistics of one mini-batch.
     Calculated as follows:

@@ -843,7 +843,7 @@ class BatchNorm3D(_BatchNormBase):
         \\sigma_{\\beta}^{2} &\\gets \\frac{1}{m} \\sum_{i=1}^{m}(x_i - \\
         \\mu_{\\beta})^2 \\qquad &//\ mini-batch\ variance \\\\

-    When track_running_stats = True, the :math:`\\mu_{\\beta}`
+    When use_global_stats = True, the :math:`\\mu_{\\beta}`
     and :math:`\\sigma_{\\beta}^{2}` are not the statistics of one mini-batch.
     They are global or running statistics (moving_mean and moving_variance). It usually got from the
     pre-trained model. Calculated as follows:
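The distinction the corrected docstrings describe (mini-batch statistics when use_global_stats = False, running statistics when use_global_stats = True) can be sketched in plain NumPy. This is an illustrative sketch, not Paddle's implementation; the function name `batch_norm` and its signature are hypothetical:

```python
import numpy as np

def batch_norm(x, running_mean, running_var, use_global_stats, eps=1e-5):
    # Hypothetical helper illustrating the docstring's two modes
    # for an input x of shape (m, C): m samples, C channels.
    if use_global_stats:
        # use_global_stats = True: normalize with the global/running
        # statistics (moving_mean, moving_variance), e.g. from a
        # pre-trained model; the current mini-batch is not used.
        mean, var = running_mean, running_var
    else:
        # use_global_stats = False: mu_beta and sigma_beta^2 are the
        # statistics of this mini-batch, per the docstring's formulas.
        mean = x.mean(axis=0)                  # mini-batch mean
        var = ((x - mean) ** 2).mean(axis=0)   # mini-batch variance
    return (x - mean) / np.sqrt(var + eps)

x = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
y = batch_norm(x, running_mean=np.zeros(2), running_var=np.ones(2),
               use_global_stats=False)
# With mini-batch statistics, each channel of y has (approximately)
# zero mean and unit variance.
```

(The scale gamma and shift beta that follow normalization in the full formulas are omitted here for brevity.)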
