Commit

fix docs

yghstill committed Aug 31, 2022
1 parent e67ea57 commit faf5ff3
Showing 1 changed file with 10 additions and 26 deletions.
36 changes: 10 additions & 26 deletions python/paddle/nn/functional/loss.py
@@ -184,27 +184,19 @@ def fluid_softmax_with_cross_entropy(logits,
 1) Hard label (one-hot label, so every sample has exactly one class)
 .. math::
-    loss_j = -\\text{logits}_{label_j} +
-    \\log\\left(\\sum_{i=0}^{K}\\exp(\\text{logits}_i)\\right), j = 1,..., K
+    loss_j = -\text{logits}_{label_j} + \log\left(\sum_{i=0}^{K}\exp(\text{logits}_i)\right), j = 1,..., K
 2) Soft label (each sample can have a distribution over all classes)
 .. math::
-    loss_j = -\\sum_{i=0}^{K}\\text{label}_i
-    \\left(\\text{logits}_i - \\log\\left(\\sum_{i=0}^{K}
-    \\exp(\\text{logits}_i)\\right)\\right), j = 1,...,K
+    loss_j = -\sum_{i=0}^{K}\text{label}_i\left(\text{logits}_i - \log\left(\sum_{i=0}^{K}\exp(\text{logits}_i)\right)\right), j = 1,...,K
 3) If :attr:`numeric_stable_mode` is :attr:`True`, softmax is calculated first by:
 .. math::
-    max_j &= \\max_{i=0}^{K}{\\text{logits}_i}
-    log\\_max\\_sum_j &= \\log\\sum_{i=0}^{K}\\exp(logits_i - max_j)
-    softmax_j &= \\exp(logits_j - max_j - {log\\_max\\_sum}_j)
+    max_j &= \max_{i=0}^{K}{\text{logits}_i} \\
+    log\_max\_sum_j &= \log\sum_{i=0}^{K}\exp(logits_i - max_j) \\
+    softmax_j &= \exp(logits_j - max_j - {log\_max\_sum}_j)
 and then cross entropy loss is calculated by softmax and label.
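
For context, the formulas above amount to a log-sum-exp computation. A minimal NumPy sketch of the hard-label, numeric-stable path (illustrative only; the function and variable names below are not part of the Paddle API):

    import numpy as np

    def stable_softmax_cross_entropy(logits, label):
        # logits: [N, K] float array; label: [N] integer class ids (hard label)
        max_j = logits.max(axis=1, keepdims=True)       # max_j = max_i logits_i
        log_max_sum = np.log(np.exp(logits - max_j).sum(axis=1, keepdims=True))
        log_softmax = logits - max_j - log_max_sum      # log(softmax_j), numerically stable
        rows = np.arange(logits.shape[0])
        # loss_j = -logits_{label_j} + log(sum_i exp(logits_i))
        return -log_softmax[rows, label]

For soft labels, the last step becomes -(label * log_softmax).sum(axis=1), matching the soft-label expression in the docstring.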
@@ -2049,27 +2041,19 @@ def softmax_with_cross_entropy(logits,
 1) Hard label (one-hot label, so every sample has exactly one class)
 .. math::
-    loss_j = -\\text{logits}_{label_j} +
-    \\log\\left(\\sum_{i=0}^{K}\\exp(\\text{logits}_i)\\right), j = 1,..., K
+    loss_j = -\text{logits}_{label_j} + \log\left(\sum_{i=0}^{K}\exp(\text{logits}_i)\right), j = 1,..., K
 2) Soft label (each sample can have a distribution over all classes)
 .. math::
-    loss_j = -\\sum_{i=0}^{K}\\text{label}_i
-    \\left(\\text{logits}_i - \\log\\left(\\sum_{i=0}^{K}
-    \\exp(\\text{logits}_i)\\right)\\right), j = 1,...,K
+    loss_j = -\sum_{i=0}^{K}\text{label}_i\left(\text{logits}_i - \log\left(\sum_{i=0}^{K}\exp(\text{logits}_i)\right)\right), j = 1,...,K
 3) If :attr:`numeric_stable_mode` is :attr:`True`, softmax is calculated first by:
 .. math::
-    max_j &= \\max_{i=0}^{K}{\\text{logits}_i}
-    log\\_max\\_sum_j &= \\log\\sum_{i=0}^{K}\\exp(logits_i - max_j)
-    softmax_j &= \\exp(logits_j - max_j - {log\\_max\\_sum}_j)
+    max_j &= \max_{i=0}^{K}{\text{logits}_i} \\
+    log\_max\_sum_j &= \log\sum_{i=0}^{K}\exp(logits_i - max_j) \\
+    softmax_j &= \exp(logits_j - max_j - {log\_max\_sum}_j)
 and then cross entropy loss is calculated by softmax and label.
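
The same docstring applies to the public API. A usage sketch, assuming a Paddle 2.x install (exact output shapes and defaults may vary by version):

    import paddle
    import paddle.nn.functional as F

    logits = paddle.randn([4, 10])                    # 4 samples, 10 classes
    hard_label = paddle.randint(0, 10, shape=[4, 1])  # one class id per sample

    # Hard-label case: loss_j = -logits_{label_j} + log(sum_i exp(logits_i))
    loss = F.softmax_with_cross_entropy(logits, hard_label)

    # Soft-label case: label is a per-sample distribution over the 10 classes
    soft_label = F.softmax(paddle.randn([4, 10]), axis=-1)
    soft_loss = F.softmax_with_cross_entropy(logits, soft_label, soft_label=True)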
