fix double backward for binary_cross_entropy loss function when `reduction=sum`. (pytorch#59479)

Summary:
Fixes pytorch#59477.

```python
In [1]: import torch

In [2]: x = torch.rand(3, 3, dtype=torch.double, requires_grad=True)

In [3]: y = torch.rand(3, 3, dtype=torch.double)

In [4]: torch.autograd.gradgradcheck(lambda x, y: torch.nn.functional.binary_cross_entropy(x, y, reduction='sum'), [x, y])
Out[4]: True

In [5]: torch.autograd.gradgradcheck(lambda x, y: torch.nn.functional.binary_cross_entropy(x, y, reduction='mean'), [x, y])
Out[5]: True

In [6]: torch.autograd.gradcheck(lambda x, y: torch.nn.functional.binary_cross_entropy(x, y, reduction='sum'), [x, y])
Out[6]: True
```

More comprehensive testing could be added in pytorch#59447, where explicit `gradcheck` and `gradgradcheck` tests are added.

Pull Request resolved: pytorch#59479
Reviewed By: ejguan
Differential Revision: D28934354
Pulled By: albanD
fbshipit-source-id: 12ce68e3c5c499b2531f7cdba3c22548d67e07e9
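For context (not part of the commit), here is a minimal sketch that exercises the double-backward path this change fixes by differentiating the gradient of the summed BCE loss a second time; it assumes a PyTorch build that includes pytorch#59479:

```python
import torch
import torch.nn.functional as F

# Double precision keeps manual gradients numerically stable.
x = torch.rand(3, 3, dtype=torch.double, requires_grad=True)
y = torch.rand(3, 3, dtype=torch.double)

# The reduction='sum' path is the one fixed by this commit.
loss = F.binary_cross_entropy(x, y, reduction='sum')

# First backward: keep the graph so the gradient itself can be differentiated.
(grad_x,) = torch.autograd.grad(loss, x, create_graph=True)

# Second backward (double backward): differentiate a scalar function of grad_x.
(grad2_x,) = torch.autograd.grad(grad_x.sum(), x)

print(grad2_x)
```

Before the fix, the `reduction='sum'` case produced incorrect second-order gradients, which is what the `gradgradcheck` call in the summary verifies.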