Add first Step in LR Schedulers #6597

Merged
merged 9 commits into master on Oct 14, 2024

Conversation

@jomayeri (Contributor) commented Oct 1, 2024

Some (but not all) of the LR schedulers in `runtime` were missing the initialization of the optimizer parameter group lr.
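For context, the change amounts to writing the scheduler's starting learning rate into each optimizer param group at construction time, so training does not begin with the stale lr that was passed to the optimizer. A minimal, self-contained sketch of that pattern (the `ToyWarmupLR` class and its helper names are illustrative, not the DeepSpeed code):

```python
import math

import torch


class ToyWarmupLR:
    """Illustrative log-warmup scheduler; a sketch, not DeepSpeed's WarmupLR."""

    def __init__(self, optimizer, warmup_min_lr=0.0, warmup_max_lr=0.001,
                 warmup_num_steps=1000, last_batch_iteration=-1):
        self.optimizer = optimizer
        self.min_lrs = [warmup_min_lr] * len(optimizer.param_groups)
        self.delta_lrs = [warmup_max_lr - warmup_min_lr] * len(optimizer.param_groups)
        self.warmup_num_steps = max(2, warmup_num_steps)
        self.inverse_log_warm_up = 1.0 / math.log(self.warmup_num_steps)
        self.last_batch_iteration = last_batch_iteration
        # The gist of the fix: write the starting lr into the optimizer's
        # param groups at construction time instead of waiting for step().
        self._update_optimizer(self.get_lr())

    def _get_gamma(self):
        step = self.last_batch_iteration + 1
        if step < self.warmup_num_steps:
            return self.inverse_log_warm_up * math.log(step + 1)
        return 1.0

    def get_lr(self):
        gamma = self._get_gamma()
        return [min_lr + gamma * delta
                for min_lr, delta in zip(self.min_lrs, self.delta_lrs)]

    def _update_optimizer(self, new_lrs):
        for param_group, lr in zip(self.optimizer.param_groups, new_lrs):
            param_group['lr'] = lr

    def step(self):
        self.last_batch_iteration += 1
        self._update_optimizer(self.get_lr())


if __name__ == "__main__":
    model = torch.nn.Linear(4, 2)
    opt = torch.optim.SGD(model.parameters(), lr=0.1)
    sched = ToyWarmupLR(opt, warmup_max_lr=0.001, warmup_num_steps=100)
    # Without the in-__init__ update, the first forward/backward would still
    # run with the 0.1 passed to SGD rather than the warmup value.
    print(opt.param_groups[0]['lr'])
```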

@jomayeri jomayeri requested a review from loadams as a code owner October 2, 2024 21:23
@loadams loadams requested a review from tohtana as a code owner October 8, 2024 15:51
@@ -675,11 +675,16 @@ def __init__(self,
        self.warmup_type = warmup_type
        self.inverse_log_warm_up = 1.0 / math.log(self.warmup_num_steps)
        self.last_batch_iteration = last_batch_iteration
        # Initialize lr in optimizer

Review comment (Contributor):

These changes are duplicated. Can we extract the code as a function and reuse it?
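A minimal sketch of the extraction the reviewer is asking for (the helper name `_initialize_lr` is hypothetical; the merged PR may have consolidated the code differently):

```python
def _initialize_lr(optimizer, new_lrs):
    """Hypothetical shared helper: write one learning rate per optimizer param group."""
    for param_group, lr in zip(optimizer.param_groups, new_lrs):
        param_group['lr'] = lr
```

Each scheduler's `__init__` and its per-step update could then call this single helper instead of repeating the loop over `param_groups`.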

@@ -851,15 +851,17 @@ def get_lr_ratio(self):
        ratio = max(0.0, self.cos_min_ratio + ratio_delta * ratio)
        return ratio

    def update_lr(self):

Review comment (Contributor):

Can we also consolidate this with the previous one?
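Continuing the sketch above, the per-step update could delegate to the same hypothetical `_initialize_lr` helper, so construction-time initialization and `update_lr` share one code path (again illustrative, not the merged code):

```python
class CosineSchedulerSketch:
    """Toy stand-in for a warmup-cosine scheduler; only the lr-writing path is shown."""

    def __init__(self, optimizer, max_lr):
        self.optimizer = optimizer
        self.max_lr = max_lr
        # Same entry point used at construction time...
        self.update_lr()

    def get_lr_ratio(self):
        # Placeholder; the real scheduler computes a warmup/cosine ratio here.
        return 1.0

    def update_lr(self):
        # ...and on every step: the shared helper writes the group lrs.
        lrs = [self.max_lr * self.get_lr_ratio()] * len(self.optimizer.param_groups)
        _initialize_lr(self.optimizer, lrs)
        self._last_lr = lrs
```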

@tohtana tohtana enabled auto-merge October 14, 2024 18:02
@tohtana tohtana added this pull request to the merge queue Oct 14, 2024
Merged via the queue into master with commit 85b7469 Oct 14, 2024
11 checks passed