Commit 8f59d79

update doc for sigmoid_cross_entropy_with_logits

1 parent 5b50307 commit 8f59d79

File tree

1 file changed: +2 −2 lines changed

paddle/fluid/operators/sigmoid_cross_entropy_with_logits_op.cc

Lines changed: 2 additions & 2 deletions
@@ -113,14 +113,14 @@ The logistic loss is given as follows:

 $$loss = -Labels * \log(\sigma(X)) - (1 - Labels) * \log(1 - \sigma(X))$$

-We know that $$\sigma(X) = (1 / (1 + \exp(-X)))$$. By substituting this we get:
+We know that $$\sigma(X) = \\frac{1}{1 + \exp(-X)}$$. By substituting this we get:

 $$loss = X - X * Labels + \log(1 + \exp(-X))$$

 For stability and to prevent overflow of $$\exp(-X)$$ when X < 0,
 we reformulate the loss as follows:

-$$loss = \max(X, 0) - X * Labels + \log(1 + \exp(-|X|))$$
+$$loss = \max(X, 0) - X * Labels + \log(1 + \exp(-\|X\|))$$

 Both the input `X` and `Labels` can carry the LoD (Level of Details) information.
 However the output only shares the LoD with input `X`.
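The equivalence of the naive loss and the reformulated stable loss documented in this diff can be checked with a small Python sketch. This is an illustration only, not the operator's actual C++ kernel, and the function names below are hypothetical:

```python
import math

def naive_sigmoid_ce(x, label):
    # Direct form: -label*log(sigmoid(x)) - (1-label)*log(1 - sigmoid(x)).
    # Overflows in exp(-x) when x is a large negative number.
    s = 1.0 / (1.0 + math.exp(-x))
    return -label * math.log(s) - (1.0 - label) * math.log(1.0 - s)

def stable_sigmoid_ce(x, label):
    # Reformulated form from the doc: max(x, 0) - x*label + log(1 + exp(-|x|)).
    # exp(-|x|) is always <= 1, so it never overflows; log1p keeps precision
    # when exp(-|x|) is tiny.
    return max(x, 0.0) - x * label + math.log1p(math.exp(-abs(x)))
```

For moderate logits both forms agree to machine precision, while for a logit like `x = -1000.0` the naive form raises an `OverflowError` in `math.exp` and the stable form still returns a finite loss (approximately `1000.0` with `label = 1.0`).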
