
[Feature] Support ReduceLrUpdaterHook #860

Merged 4 commits into master on May 12, 2022

Conversation

Yshuo-Li
Collaborator

No description provided.
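The PR carries no description, but the hook's name suggests behavior analogous to PyTorch's `torch.optim.lr_scheduler.ReduceLROnPlateau`: multiply the learning rate by a factor once a monitored metric stops improving for a number of steps. Below is a minimal, self-contained sketch of that plateau logic; the class name, parameters, and `step` interface are illustrative stand-ins, not the PR's actual API.

```python
class PlateauLrSketch:
    """Illustrative reduce-on-plateau logic (not the PR's actual class).

    After more than `patience` consecutive steps without improvement of
    the monitored metric, multiply the learning rate by `factor`.
    """

    def __init__(self, lr=0.1, factor=0.5, patience=2, mode='min'):
        self.lr = lr
        self.factor = factor
        self.patience = patience
        self.mode = mode
        self.best = None
        self.num_bad_steps = 0

    def step(self, metric):
        improved = (
            self.best is None
            or (self.mode == 'min' and metric < self.best)
            or (self.mode == 'max' and metric > self.best))
        if improved:
            self.best = metric
            self.num_bad_steps = 0
        else:
            self.num_bad_steps += 1
            if self.num_bad_steps > self.patience:
                self.lr *= self.factor  # reduce lr after the plateau
                self.num_bad_steps = 0
        return self.lr


sched = PlateauLrSketch(lr=0.1, factor=0.5, patience=2)
for loss in [1.0, 0.9, 0.9, 0.9, 0.9]:  # loss plateaus after step 2
    lr = sched.step(loss)
print(lr)  # 0.05: halved once the plateau exceeds patience
```

A real hook would additionally read the metric from the runner's log buffer and apply the new lr to every optimizer param group; this sketch only shows the scheduling decision itself.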

@codecov

codecov bot commented Apr 22, 2022

Codecov Report

Merging #860 (6c6a715) into master (fa272ac) will increase coverage by 0.22%.
The diff coverage is 92.25%.

@@            Coverage Diff             @@
##           master     #860      +/-   ##
==========================================
+ Coverage   83.26%   83.48%   +0.22%     
==========================================
  Files         222      222              
  Lines       12669    12808     +139     
  Branches     2054     2093      +39     
==========================================
+ Hits        10549    10693     +144     
+ Misses       1795     1784      -11     
- Partials      325      331       +6     
Flag Coverage Δ
unittests 83.44% <92.25%> (+0.21%) ⬆️

Flags with carried forward coverage won't be shown.

Impacted Files Coverage Δ
mmedit/core/scheduler/lr_updater.py 93.03% <92.08%> (+66.72%) ⬆️
mmedit/core/__init__.py 100.00% <100.00%> (ø)
mmedit/core/scheduler/__init__.py 100.00% <100.00%> (ø)
...ls/components/stylegan2/generator_discriminator.py 86.14% <0.00%> (+1.20%) ⬆️


Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update fa272ac...6c6a715.

lr_updater = LinearLrUpdaterHook(by_epoch=True)
lr_updater.get_lr(fake_runner, 1)
lr_updater.start = 10
lr_updater.get_lr(fake_runner, 1)
Member

We need to at least assert something in the test, e.g. that the learning rate reduces to the desired value.
Same for test_reduce_lr_updater_hook
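What the reviewer asks for could look like the sketch below. Since `LinearLrUpdaterHook` and `fake_runner` are not available here, a standalone `linear_lr` function stands in for the hook's `get_lr`; its signature and the `start`/`min_lr` parameters are assumptions for illustration, not the PR's exact implementation.

```python
def linear_lr(base_lr, progress, max_progress, start=0, min_lr=0.0):
    """Hypothetical stand-in for LinearLrUpdaterHook.get_lr.

    Linearly anneals base_lr toward min_lr once `progress` passes
    `start`; the real hook reads progress from the runner instead of
    taking it as an argument.
    """
    if progress < start:
        return base_lr  # decay has not started yet
    frac = (progress - start) / max(max_progress - start, 1)
    return base_lr + (min_lr - base_lr) * frac


# The kind of assertions the review asks the test to make:
assert linear_lr(0.1, 0, 10) == 0.1               # unchanged at step 0
assert abs(linear_lr(0.1, 5, 10) - 0.05) < 1e-12  # halfway through decay
assert linear_lr(0.1, 1, 10, start=10) == 0.1     # start=10: no decay yet
```

Asserting concrete values like these, rather than merely calling `get_lr`, is what turns the snippet above into an actual test.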

Member

Is this another feature irrelevant to reducing on the plateau?

Collaborator Author

> Is this another feature irrelevant to reducing on the plateau?

Yes, it is another feature, unrelated to reducing on the plateau, which currently lacks a unit test.

@wangruohui
Member

We still need to assert that the lr returned by lr_updater.get_lr(fake_runner, 1) is as expected. Those assertions are currently missing.

@wangruohui wangruohui merged commit 9a01375 into open-mmlab:master May 12, 2022
wangruohui pushed a commit to wangruohui/mmediting that referenced this pull request Jul 7, 2022
* [Feature] Support ReduceLrUpdaterHook

* Update

* Update unittest
Yshuo-Li added a commit to Yshuo-Li/mmediting that referenced this pull request Jul 15, 2022
* [Feature] Support ReduceLrUpdaterHook

* Update

* Update unittest
Yshuo-Li added a commit to Yshuo-Li/mmediting that referenced this pull request Jul 16, 2022
* [Feature] Support ReduceLrUpdaterHook

* Update

* Update unittest
2 participants