Commit

Clarify log warning for Wasserstein Loss
KonstiNik committed Apr 24, 2024
1 parent e854df7 commit da01395
Showing 1 changed file with 6 additions and 1 deletion.
znnl/loss_functions/wasserstein_loss.py — 7 changes: 6 additions & 1 deletion
@@ -49,7 +49,12 @@ def __init__(self):
         """
         Constructor for the Wasserstein loss class.
         """
-        logger.warning("The Wasserstein loss cannot be used for gradient calculations.")
+        logger.warning(
+            "The Wasserstein loss cannot be used for gradient calculations, "
+            "and therefore not for neural network training. "
+            "It can be used to compare two distributions, however, the current "
+            "implementation is not differentiable!"
+        )
 
         super(WassersteinLoss, self).__init__()
         self.metric = WassersteinDistance()
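
For context, the clarified warning means the loss can still be evaluated as a distance between two sets of samples; it just cannot be differentiated through for training. Below is a minimal sketch of such a comparison. The import path follows the file shown in this diff, but the call signature (the loss object being callable on two sample arrays and returning a scalar) is an assumption, not something visible in the commit.

    import numpy as np

    from znnl.loss_functions.wasserstein_loss import WassersteinLoss

    # Two sets of samples drawn from slightly shifted normal distributions.
    samples_a = np.random.default_rng(0).normal(loc=0.0, scale=1.0, size=(500, 1))
    samples_b = np.random.default_rng(1).normal(loc=0.5, scale=1.0, size=(500, 1))

    # Instantiating the loss emits the warning clarified in this commit.
    loss = WassersteinLoss()

    # Assumed usage: call the loss on the two sample sets to get a scalar
    # distance. Do not differentiate through this value; per the warning,
    # the current implementation is not differentiable.
    distance = loss(samples_a, samples_b)
    print(f"Wasserstein distance between the two sample sets: {distance}")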
