@marco-rudolph well, flows can minimize the KL-divergence only up to a constant or, alternatively, can have log-likelihoods larger than 0.0, which is not desirable. So, to prevent this and to avoid a negative loss, I added the sigmoid. In practice, it can help to improve performance a bit.
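A minimal numerical illustration of the point (a sketch, not the repository's code): once the log-likelihood goes above zero, the plain negative log-likelihood turns negative, while -logsigmoid(logp) stays strictly positive and approaches the plain NLL for very negative log-likelihoods.

```python
import torch
import torch.nn.functional as F

logp = torch.tensor([-2.0, 0.0, 3.0])  # per-sample log-likelihoods

plain_nll = -logp              # goes negative once logp > 0
bounded = -F.logsigmoid(logp)  # always positive; ~= -logp for logp << 0

print(plain_nll)  # tensor([ 2., -0., -3.])
print(bounded)    # tensor([2.1269, 0.6931, 0.0486])
```

In effect the sigmoid treats the log-likelihood as a logit, so the loss saturates near zero instead of rewarding ever-larger likelihoods.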
Why are likelihoods larger than 0.0 not desirable? A negative loss by itself should not be a problem as long as it does not lead to instabilities, which is not really the case here.
But it is an interesting finding that this improves the performance :)
Hi Denis!
According to your code, you use some "negative log sigmoid log likelihood loss":

cflow-ad/train.py, line 76 (commit 4d6ec47)
cflow-ad/train.py, line 78 (commit 4d6ec47)

with

cflow-ad/train.py, line 18 (commit 4d6ec47)
What is the motivation behind using this kind of loss?
Thanks in advance,
Marco
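For readers without the repository open, the referenced lines plausibly combine a standard-normal log-likelihood with the flow's log-Jacobian-determinant and then pass the per-dimension log-likelihood through a log-sigmoid. The sketch below is a reconstruction under that assumption; the names (get_logp, C, z, log_jac_det) and the exact normalization are illustrative assumptions, not a verbatim copy of train.py.

```python
import math
import torch
import torch.nn.functional as F

def get_logp(C, z, logdet_J):
    # Log-density of z under a C-dimensional standard normal prior,
    # plus the flow's log|det J| term from the change of variables.
    log_norm = -0.5 * C * math.log(2 * math.pi)
    return log_norm - 0.5 * torch.sum(z ** 2, dim=1) + logdet_J

# z: latent codes produced by the flow; log_jac_det: summed log|det J| per sample
C = 256
z = torch.randn(8, C)
log_jac_det = torch.zeros(8)

logp = get_logp(C, z, log_jac_det) / C  # per-dimension log-likelihood
loss = -F.logsigmoid(logp).mean()       # the "negative log sigmoid log likelihood"
```

Compared with a plain `-logp.mean()` maximum-likelihood objective, the only change is the log-sigmoid wrapper discussed in the reply above.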