[Refactor] Merge BEiT and ConvNext's LR decay optimizer constructors #1438
Conversation
Codecov Report
@@ Coverage Diff @@
## master #1438 +/- ##
==========================================
+ Coverage 90.30% 90.34% +0.03%
==========================================
Files 140 140
Lines 8345 8327 -18
Branches 1403 1400 -3
==========================================
- Hits 7536 7523 -13
+ Misses 571 569 -2
+ Partials 238 235 -3
```python
@OPTIMIZER_BUILDERS.register_module()
class LayerDecayOptimizerConstructor(DefaultOptimizerConstructor):
```
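For context, a constructor registered this way is selected by name from the optimizer config. Below is a minimal sketch of such a config; the field names (`constructor`, `paramwise_cfg`, `decay_rate`, `decay_type`, `num_layers`) follow the usual mmseg ConvNeXt configs and are assumptions, not lines from this diff:

```python
# Hypothetical optimizer config selecting the registered constructor by name.
optimizer = dict(
    type='AdamW',
    lr=1e-4,
    weight_decay=0.05,
    constructor='LearningRateDecayOptimizerConstructor',
    paramwise_cfg=dict(
        decay_rate=0.9,           # geometric LR decay factor per layer/stage
        decay_type='stage_wise',  # or 'layer_wise'
        num_layers=6))            # depth used to compute the decay exponent
```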
Can we merge LayerDecayOptimizerConstructor and LearningRateDecayOptimizerConstructor into one class?
Yep.
We must add a deprecation warning for the deleted one.
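A common way to satisfy this is to keep the old class as a thin subclass of the merged one that warns on construction. A sketch of that pattern follows (the `(optimizer_cfg, paramwise_cfg)` signature matches mmcv's DefaultOptimizerConstructor; this is not necessarily the exact code in this PR):

```python
import warnings

@OPTIMIZER_BUILDERS.register_module()
class LayerDecayOptimizerConstructor(LearningRateDecayOptimizerConstructor):
    """Deprecated: kept only so existing configs keep working."""

    def __init__(self, optimizer_cfg, paramwise_cfg):
        warnings.warn('LayerDecayOptimizerConstructor is deprecated; '
                      'please use LearningRateDecayOptimizerConstructor '
                      'instead', DeprecationWarning)
        super().__init__(optimizer_cfg, paramwise_cfg)
```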
I suggest splitting this, because the diff shows the difference between the original two. Btw, it would be absolutely excellent if you can merge these two.
I suggest we remind users what these two optimizer constructors are for, because based on their names, users may not know the relation between them. The wording of the related comments is flexible, so you name it.
[Refactor] Merge BEiT and ConvNext's LR decay optimizer constructors (open-mmlab#1438)

* move layer_decay_optimizer_constructor
* fix (×2)
* merge test_core
* fix
* add DeprecationWarning
* fix DeprecationWarning
* fix (×12)
* fix test
* fix (×4)
* fix ut
* fix (×2)
* Update tests/test_core/test_layer_decay_optimizer_constructor.py
* fix (×4)

Co-authored-by: MeowZheng <meowzheng@outlook.com>
Co-authored-by: Miao Zheng <76149310+MeowZheng@users.noreply.github.com>
Thanks for your contribution; we appreciate it a lot. The following instructions will make your pull request healthier and help it get feedback more easily. If you do not understand some items, don't worry: just make the pull request and seek help from the maintainers.
Motivation
BEiT's and ConvNext's LR decay optimizer constructors have similar functionality; the only difference between them is how the layer/stage id is obtained, so we merge them in this PR.
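Concretely, the merged constructor can branch on a single `decay_type` flag when mapping a parameter name to a group id. Here is a minimal sketch of the idea; the parameter-name patterns and the blocks-per-stage grouping are illustrative assumptions, not the PR's exact code:

```python
def get_group_id(var_name, decay_type, num_layers):
    """Map a parameter name to a layer/stage id (illustrative only)."""
    if var_name.startswith('backbone.patch_embed'):
        return 0
    if var_name.startswith('backbone.layers'):
        block_id = int(var_name.split('.')[2])
        if decay_type == 'layer_wise':
            # BEiT-style: every block gets its own id.
            return block_id + 1
        # ConvNeXt-style ('stage_wise'): several blocks share one stage id.
        return block_id // 3 + 1
    # Decode head and anything else outside the backbone.
    return num_layers - 1

# The id then sets a geometric LR scale for the parameter group:
#   scale = decay_rate ** (num_layers - 1 - group_id)
```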
Modification
- get_num_layer_layer_wise
- get_num_stage_stage_wise
- layer_wise
- LayerDecayOptimizerConstructor

A simplified sketch of how these pieces fit together is given below.
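Putting the pieces together, a layer-decay constructor essentially buckets parameters by their group id and assigns each bucket a scaled learning rate. A rough, simplified sketch reusing the hypothetical `get_group_id` above (weight-decay handling omitted; not the PR's exact implementation):

```python
def build_param_groups(model, base_lr, decay_rate, num_layers,
                       decay_type='layer_wise'):
    """Group a model's parameters and scale each group's LR geometrically."""
    groups = {}
    for name, param in model.named_parameters():
        if not param.requires_grad:
            continue  # frozen parameters get no group
        gid = get_group_id(name, decay_type, num_layers)
        scale = decay_rate ** (num_layers - 1 - gid)
        if gid not in groups:
            groups[gid] = {'params': [], 'lr': base_lr * scale}
        groups[gid]['params'].append(param)
    # The resulting list is what gets handed to e.g. torch.optim.AdamW.
    return list(groups.values())
```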