fix double backward for binary_cross_entropy loss function when `reduction=sum`. (pytorch#59479)

Summary:
Fixes pytorch#59477.

```python
In [1]: import torch

In [2]: x = torch.rand(3, 3, dtype=torch.double, requires_grad=True)

In [3]: y = torch.rand(3, 3, dtype=torch.double)

In [4]: torch.autograd.gradgradcheck(lambda x, y: torch.nn.functional.binary_cross_entropy(x, y, reduction='sum'), [x, y])
Out[4]: True

In [5]: torch.autograd.gradgradcheck(lambda x, y: torch.nn.functional.binary_cross_entropy(x, y, reduction='mean'), [x, y])
Out[5]: True

In [6]: torch.autograd.gradcheck(lambda x, y: torch.nn.functional.binary_cross_entropy(x, y, reduction='sum'), [x, y])
Out[6]: True

```

More comprehensive testing can be added in pytorch#59447, where explicit `gradcheck` and `gradgradcheck` tests are introduced; a sketch of such checks follows below.
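Until then, a rough sketch of what explicit checks across all reduction modes might look like (the loop and tolerances here are illustrative, not the actual tests from pytorch#59447):

```python
import torch
import torch.nn.functional as F

# Double precision keeps the finite-difference checks numerically stable.
x = torch.rand(3, 3, dtype=torch.double, requires_grad=True)
y = torch.rand(3, 3, dtype=torch.double)

for reduction in ('none', 'mean', 'sum'):
    fn = lambda x, y: F.binary_cross_entropy(x, y, reduction=reduction)
    # First- and second-order gradients w.r.t. the input `x`
    # (the target `y` does not require grad).
    assert torch.autograd.gradcheck(fn, [x, y])
    assert torch.autograd.gradgradcheck(fn, [x, y])
```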

Pull Request resolved: pytorch#59479

Reviewed By: ejguan

Differential Revision: D28934354

Pulled By: albanD

fbshipit-source-id: 12ce68e3c5c499b2531f7cdba3c22548d67e07e9
nikitaved authored and facebook-github-bot committed Jun 7, 2021
1 parent 77dde35 commit a30b359
3 changes: 1 addition & 2 deletions torch/csrc/autograd/FunctionsManual.cpp
```diff
@@ -1278,9 +1278,8 @@ Tensor binary_cross_entropy_double_backward(const Tensor & grad_output, const Te
   }
   if (reduction == at::Reduction::Mean) {
     return gI / input.numel();
-  } else if (reduction == at::Reduction::Sum) {
-    return gI.sum();
   }
+
   return gI;
 }
```
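The removed `Sum` branch was the bug: in double backward, `gI` is a gradient with respect to `input` and must keep `input`'s shape for every reduction mode, but `gI.sum()` collapsed it to a scalar where an input-shaped tensor was expected. A minimal sketch of a manual double-backward call that exercises this code path (variable names are illustrative):

```python
import torch
import torch.nn.functional as F

x = torch.rand(3, 3, dtype=torch.double, requires_grad=True)
y = torch.rand(3, 3, dtype=torch.double)

loss = F.binary_cross_entropy(x, y, reduction='sum')
# First backward: d(loss)/dx has the same shape as x.
(gx,) = torch.autograd.grad(loss, x, create_graph=True)
# Second backward routes through binary_cross_entropy_double_backward;
# the result must also be input-shaped, not reduced to a scalar.
(ggx,) = torch.autograd.grad(gx.sum(), x)
assert ggx.shape == x.shape
```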

