🚀 Feature
Currently, the correct way to use the `LRScheduler` wrapper for PyTorch >= 1.1.0 is the following:

```python
from torch.optim.lr_scheduler import StepLR

from ignite.contrib.handlers import LRScheduler

torch_lr_scheduler = StepLR(optimizer, step_size=3, gamma=0.1)
scheduler = LRScheduler(torch_lr_scheduler)


@trainer.on(Events.ITERATION_COMPLETED)
def print_lr():
    print(optimizer.param_groups[0]["lr"])


# In this example, we assume PyTorch >= 1.1.0 is installed
# (with the new `torch.optim.lr_scheduler` behaviour) and we attach
# the scheduler to Events.ITERATION_COMPLETED instead of
# Events.ITERATION_STARTED to make sure the first lr value from the
# optimizer is used, otherwise it would be skipped:
trainer.add_event_handler(Events.ITERATION_COMPLETED, scheduler)

trainer.run([0] * 8, max_epochs=1)
```

```text
0.1
0.1
0.1
0.010
0.010
0.010
0.001
0.001
```
However, other schedulers should be used as follows (link):

```python
milestones_values = [(1, 1.0), (3, 0.8), (5, 0.2)]
scheduler = PiecewiseLinear(optimizer, "lr", milestones_values=milestones_values)
trainer.add_event_handler(Events.ITERATION_STARTED, scheduler)


@trainer.on(Events.ITERATION_COMPLETED)
def print_lr():
    print(optimizer.param_groups[0]["lr"])


trainer.run([0] * 6, max_epochs=1)
```
The idea is to improve `LRScheduler` so that it can be attached to `Events.ITERATION_STARTED`, giving a coherent API across all schedulers. This would be a BC-breaking change, but for the better. The desired usage of `LRScheduler` would then be:
```python
from torch.optim.lr_scheduler import StepLR

torch_lr_scheduler = StepLR(optimizer, step_size=3, gamma=0.1)
scheduler = LRScheduler(torch_lr_scheduler)
trainer.add_event_handler(Events.ITERATION_STARTED, scheduler)


@trainer.on(Events.ITERATION_COMPLETED)
def print_lr():
    print(optimizer.param_groups[0]["lr"])


trainer.run([0] * 8, max_epochs=1)
```
Currently, this gives wrong behaviour: the first lr value 0.1 is not consumed by any training step, so only seven values are printed for eight iterations:

```text
0.1
0.1
0.010
0.010
0.010
0.001
0.001
```
The idea could be to retain the first lr value, reapply it once, and then keep everything as it is now.
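A minimal sketch of that idea, without torch or ignite: `FakeOptimizer` and `FakeStepLR` below are hypothetical stand-ins for `optimizer` and `torch.optim.lr_scheduler.StepLR`, and `LRSchedulerSketch` is an assumed handler shape, not the actual `LRScheduler` implementation. The wrapper makes its first call a no-op, so the optimizer's initial lr is consumed by the first iteration, and only steps the wrapped scheduler from the second call onward:

```python
class FakeOptimizer:
    """Stand-in for a torch optimizer: just holds param_groups with an lr."""
    def __init__(self, lr):
        self.param_groups = [{"lr": lr}]


class FakeStepLR:
    """Mimics StepLR(step_size=3, gamma=0.1): decays lr every `step_size` steps."""
    def __init__(self, optimizer, step_size, gamma):
        self.optimizer = optimizer
        self.step_size = step_size
        self.gamma = gamma
        self._n = 0

    def step(self):
        self._n += 1
        if self._n % self.step_size == 0:
            for group in self.optimizer.param_groups:
                group["lr"] *= self.gamma


class LRSchedulerSketch:
    """Hypothetical wrapper attachable at Events.ITERATION_STARTED:
    the first call is a no-op so the initial lr is used once, then
    the wrapped scheduler is stepped on every subsequent call."""
    def __init__(self, lr_scheduler):
        self.lr_scheduler = lr_scheduler
        self._first_call = True

    def __call__(self, *args):
        if self._first_call:
            self._first_call = False  # retain the initial lr for iteration 1
        else:
            self.lr_scheduler.step()


opt = FakeOptimizer(0.1)
scheduler = LRSchedulerSketch(FakeStepLR(opt, step_size=3, gamma=0.1))

lrs = []
for _ in range(8):
    scheduler()  # would fire on Events.ITERATION_STARTED
    lrs.append(opt.param_groups[0]["lr"])

print(lrs)  # lr pattern matches the desired output: 3x 0.1, 3x 0.01, 2x 0.001
```

With this, attaching to `Events.ITERATION_STARTED` yields the same eight lr values as the current `Events.ITERATION_COMPLETED` workaround.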