LearningRateMonitor doesn't reflect LearningRateFinder changes #15856

Open
@davidgilbertson

Bug description

It seems that when `LearningRateFinder` changes the LR of my model's optimizer, the change isn't reflected in the values logged by `LearningRateMonitor`.

I think the offending code is here, as of version 1.8.3.post1: https://github.com/Lightning-AI/lightning/blob/master/src/pytorch_lightning/callbacks/lr_monitor.py#L187-L191

There, `trainer.optimizers` returns the optimizer with the original LR value, while `trainer.model.optimizers()` has the updated value.
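To illustrate the kind of staleness described above, here is a minimal standalone sketch (not the actual Lightning internals; all class and method names are hypothetical stand-ins): a trainer that caches its optimizer list at setup time will keep reporting the old LR after the model's optimizer is replaced with a reconfigured one.

```python
# Illustrative sketch only: shows how a monitor reading a cached optimizer
# list can miss an LR update applied via a newly created optimizer on the model.

class Optimizer:
    """Stand-in for torch.optim.Optimizer: holds param_groups with an 'lr' key."""
    def __init__(self, lr):
        self.param_groups = [{"lr": lr}]

class Model:
    """Stand-in for a LightningModule that exposes its current optimizer."""
    def __init__(self):
        self._optimizer = Optimizer(lr=1e-3)

    def optimizers(self):
        return self._optimizer

    def reconfigure(self, new_lr):
        # Hypothetical: an LR-finder-like step replaces the optimizer entirely,
        # so old references to the previous optimizer object go stale.
        self._optimizer = Optimizer(lr=new_lr)

class Trainer:
    """Stand-in trainer that snapshots the optimizer list once, at setup."""
    def __init__(self, model):
        self.model = model
        self.optimizers = [model.optimizers()]  # cached reference

model = Model()
trainer = Trainer(model)
model.reconfigure(new_lr=0.05)  # like LearningRateFinder applying a suggestion

print(trainer.optimizers[0].param_groups[0]["lr"])  # stale value: 0.001
print(model.optimizers().param_groups[0]["lr"])     # updated value: 0.05
```

If the monitor reads LRs from the cached `trainer.optimizers` list rather than from the model's live optimizers, it logs the pre-finder value, which matches the behavior reported here.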

How to reproduce the bug

No response

More info

No response
