intuition behind the loss functions loss_idt_A and loss_idt_B #322
I think the identity loss is used to preserve color and prevent color inversion in the result.
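For concreteness, here is a minimal sketch of how such an identity term can be computed in PyTorch. The generator directions, helper name, and default weights are assumptions chosen to loosely mirror cycle_gan_model.py, not a verbatim copy of it:

```python
import torch.nn as nn

# Sketch only: netG_A is assumed to translate A -> B, netG_B to translate B -> A.
criterionIdt = nn.L1Loss()

def identity_losses(netG_A, netG_B, real_A, real_B,
                    lambda_A=10.0, lambda_B=10.0, lambda_idt=0.5):
    # Feed each generator an image that is already in its *output* domain
    # and penalize any change, so the generator learns to leave such images
    # (and in particular their colors) untouched.
    idt_A = netG_A(real_B)  # G_A should act as an identity map on B
    loss_idt_A = criterionIdt(idt_A, real_B) * lambda_B * lambda_idt
    idt_B = netG_B(real_A)  # G_B should act as an identity map on A
    loss_idt_B = criterionIdt(idt_B, real_A) * lambda_A * lambda_idt
    return loss_idt_A, loss_idt_B
```

These two terms are added to the usual adversarial and cycle-consistency losses when updating the generators.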
@devraj89 This is a great question. For your questions:
How would the identity loss fit when the two generators have different input and output channel counts, e.g. for image colorization? The generator's input and output shapes must match in order to compute this loss.
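To make the shape problem concrete, here is a hedged sketch of one common workaround: simply skip the identity terms when the two domains have different channel counts (equivalent to setting the identity weight to zero). The helper name and fallback behavior are illustrative assumptions, not part of the repository:

```python
import torch.nn as nn

criterionIdt = nn.L1Loss()

def identity_losses_if_possible(netG_A, netG_B, real_A, real_B,
                                lambda_A=10.0, lambda_B=10.0, lambda_idt=0.5):
    # If domain A is grayscale (1 channel) and domain B is RGB (3 channels),
    # G_A(real_B) is not even well defined, so the identity terms are usually
    # disabled, or the grayscale images are replicated to 3 channels so that
    # both domains share the same shape.
    if lambda_idt == 0 or real_A.shape[1] != real_B.shape[1]:
        zero = real_A.new_zeros(())  # contributes nothing to the total loss
        return zero, zero
    loss_idt_A = criterionIdt(netG_A(real_B), real_B) * lambda_B * lambda_idt
    loss_idt_B = criterionIdt(netG_B(real_A), real_A) * lambda_A * lambda_idt
    return loss_idt_A, loss_idt_B
```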
Hi,

Thank you for posting this wonderful code, but I am wondering what the intuition is behind the two losses loss_idt_A and loss_idt_B in the cycle_gan_model.py file. Reading through the implementation, it seems the loss is meant to discourage the generator from translating an image that is already in the target domain: if an image is in domain B, then G_A should act as an identity mapping and not try to translate it.

Though I understand the intuition behind this loss, I have several questions about it:

[1] Why exactly is this loss relevant? Since this is a controlled training setup where we know which domain each image comes from, why would we feed domain B images through G_A at all?
[2] Is this loss relevant at test time, when the domain of the image is unknown?
[3] Is the loss mentioned anywhere in the paper?
[4] Does the loss help in generating better images? Has any benchmarking been done for this?

Thanks again for the code! Hoping to get these doubts cleared soon!
Devraj
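Regarding question [3]: the CycleGAN paper (Zhu et al., 2017) does discuss an identity mapping loss in its painting-to-photo experiments, roughly of the form (notation mine)

$$\mathcal{L}_{\text{identity}}(G, F) = \mathbb{E}_{y \sim p_{\text{data}}(y)}\big[\lVert G(y) - y \rVert_1\big] + \mathbb{E}_{x \sim p_{\text{data}}(x)}\big[\lVert F(x) - x \rVert_1\big],$$

where G and F are the two generators; loss_idt_A and loss_idt_B appear to correspond to the two expectations, each scaled by a weighting factor.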