Hi
I have trained CycleGAN on my own dataset. During training, I found that the GAN loss of Generator A does not converge and keeps increasing, while the GAN losses of Generator B and Discriminators A and B all converge. Has anyone met this problem before? Thank you.
This is quite normal, and it is fine as long as the losses do not explode. We are using LSGAN, and there is no guarantee that the GAN loss will converge. You can get a convergent loss if you use WGAN (or WGAN-GP) instead.
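For reference, here is a minimal sketch (in NumPy, not the repository's actual PyTorch code) of the two loss formulations being contrasted. `d_real` and `d_fake` stand for the discriminator's raw outputs on real and generated samples; these names are placeholders, not identifiers from the codebase. LSGAN penalizes the squared distance from target labels, while WGAN uses unbounded critic scores, whose difference estimates a Wasserstein distance and tends to correlate with sample quality.

```python
import numpy as np

def lsgan_g_loss(d_fake):
    # LSGAN generator loss: push the discriminator's score on fakes toward 1.
    return 0.5 * np.mean((d_fake - 1.0) ** 2)

def lsgan_d_loss(d_real, d_fake):
    # LSGAN discriminator loss: reals toward 1, fakes toward 0.
    return 0.5 * np.mean((d_real - 1.0) ** 2) + 0.5 * np.mean(d_fake ** 2)

def wgan_g_loss(d_fake):
    # WGAN generator loss: maximize the critic's score on fakes.
    return -np.mean(d_fake)

def wgan_d_loss(d_real, d_fake):
    # WGAN critic loss: separate real and fake scores.
    # (WGAN-GP adds a gradient penalty term on interpolated samples,
    # omitted here since it needs autograd.)
    return np.mean(d_fake) - np.mean(d_real)
```

The LSGAN generator and discriminator optimize different objectives, so neither loss is guaranteed to decrease monotonically; the WGAN critic loss, by contrast, gives a meaningful quantity to monitor for convergence.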