Multiple optimizers support in training_epoch_end #1397
Comments
@asafmanor could you please check this?
@williamFalcon should we support this?
This issue has been automatically marked as stale because it hasn't had any recent activity. This issue will be closed in 7 days if no further activity occurs. Thank you for your contributions, Pytorch Lightning Team!
🐛 Bug
`training_epoch_end` only receives the return values from the `training_step` call made for the last optimizer in each batch. See:
https://github.com/PyTorchLightning/pytorch-lightning/blob/master/pytorch_lightning/trainer/training_loop.py#L603
https://github.com/PyTorchLightning/pytorch-lightning/blob/master/pytorch_lightning/trainer/training_loop.py#L657
To Reproduce
Create a `LightningModule` with two optimizers. Try to access both `training_step` return values in `training_epoch_end`.

Code sample
(trivial; a minimal sketch is given below)
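The issue did not include a code sample, so here is a minimal sketch of what a reproduction could look like, assuming a Lightning version from around the time this issue was filed (~0.7.x); hook signatures and epoch-end return conventions have changed in later releases, and all class and variable names below are made up for illustration.

```python
import torch
from torch.utils.data import DataLoader, TensorDataset
import pytorch_lightning as pl


class TwoOptimizerModule(pl.LightningModule):
    """Hypothetical module with two optimizers, one per sub-network."""

    def __init__(self):
        super().__init__()
        self.net_a = torch.nn.Linear(4, 1)
        self.net_b = torch.nn.Linear(4, 1)

    def training_step(self, batch, batch_idx, optimizer_idx):
        x, y = batch
        net = self.net_a if optimizer_idx == 0 else self.net_b
        loss = torch.nn.functional.mse_loss(net(x), y)
        # Tag each output with its optimizer so we can inspect what
        # actually reaches training_epoch_end.
        return {"loss": loss, "optimizer_idx": optimizer_idx}

    def training_epoch_end(self, outputs):
        # Expected: outputs from both optimizers (idx 0 and idx 1).
        # Observed (per this report): only the last optimizer's outputs.
        print("optimizer_idx values seen:",
              sorted({o.get("optimizer_idx") for o in outputs}))
        return {}  # older Lightning versions expect a dict here

    def configure_optimizers(self):
        return [
            torch.optim.SGD(self.net_a.parameters(), lr=0.01),
            torch.optim.SGD(self.net_b.parameters(), lr=0.01),
        ]

    def train_dataloader(self):
        x, y = torch.randn(64, 4), torch.randn(64, 1)
        return DataLoader(TensorDataset(x, y), batch_size=8)


if __name__ == "__main__":
    pl.Trainer(max_epochs=1).fit(TwoOptimizerModule())
```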
Expected behavior
I'd like to have access to the outputs from both optimizers, e.g. as a list of tuples or a tuple of lists. I'd probably prefer a tuple of lists, i.e. a separate list of return values for each optimizer (see the sketch below).
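As a sketch of the requested (not current) behavior, assuming the same made-up `training_step` outputs as above, the tuple-of-lists layout would let `training_epoch_end` handle each optimizer separately:

```python
def training_epoch_end(self, outputs):
    # Hypothetical desired structure: one list of training_step outputs
    # per optimizer, i.e. outputs == (outputs_opt_0, outputs_opt_1).
    outputs_opt_0, outputs_opt_1 = outputs
    avg_loss_0 = torch.stack([o["loss"] for o in outputs_opt_0]).mean()
    avg_loss_1 = torch.stack([o["loss"] for o in outputs_opt_1]).mean()
    return {"log": {"avg_loss_opt_0": avg_loss_0,
                    "avg_loss_opt_1": avg_loss_1}}
```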
Environment
Additional context