Bug description
It seems that when LearningRateFinder changes the LR of my model's optimizer, the change isn't reflected in the values logged by LearningRateMonitor.
I think the offending lines are here: https://github.com/Lightning-AI/lightning/blob/master/src/pytorch_lightning/callbacks/lr_monitor.py#L187-L191 (as of version 1.8.3.post1). There, trainer.optimizers returns the optimizer with the original LR value, while trainer.model.optimizers() has the updated value.
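A conceptual sketch of the suspected failure mode (plain Python, not actual Lightning code; the class and attribute names below are hypothetical stand-ins): if the monitor reads LRs from a trainer-side list that still points at the original optimizer, while the tuner swapped in a re-created optimizer reachable only through the model-side accessor, the monitor logs a stale value.

```python
class Optimizer:
    """Minimal stand-in for a torch optimizer with param_groups."""
    def __init__(self, lr):
        self.param_groups = [{"lr": lr}]

class Trainer:
    def __init__(self, optimizer):
        # Trainer-side list, analogous to trainer.optimizers
        self.optimizers = [optimizer]
        # Model-side reference, analogous to trainer.model.optimizers()
        self.model_optimizer = optimizer

def tune(trainer):
    # Mimics a tuner replacing the optimizer with a new LR: the
    # model-side reference is updated, the trainer-side list is not.
    trainer.model_optimizer = Optimizer(lr=0.01)

trainer = Trainer(Optimizer(lr=0.1))
tune(trainer)

stale_lr = trainer.optimizers[0].param_groups[0]["lr"]   # what the monitor would log
live_lr = trainer.model_optimizer.param_groups[0]["lr"]  # what training actually uses
print(stale_lr, live_lr)  # 0.1 0.01
```

If this is what is happening, reading LRs through trainer.model.optimizers() (or mutating param_groups in place rather than replacing the optimizer) would keep the two views consistent.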
How to reproduce the bug
No response
More info
No response