Description
Prerequisites
- I checked to make sure that this feature has not been requested already.
1. The entire URL of the file you are using
models/official/nlp/optimization.py
Lines 68 to 107 in 7ecbac3
2. Describe the feature you request
Similar to `beta_1`, which is already a tunable parameter for creating optimizers, we can add `beta_2`, `epsilon`, `weight_decay_rate`, and `exclude_from_weight_decay` as tunable parameters by passing them as arguments through `create_optimizer`.
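A minimal sketch of what the extended signature could look like. The default values and the stand-in `AdamWeightDecay` class below are assumptions for illustration only; the real optimizer class lives in `models/official/nlp/optimization.py` and wraps `init_lr` in a warmup/decay schedule, which is unchanged by this request, so only the parameter pass-through is shown:

```python
class AdamWeightDecay:
    """Stand-in for the real AdamWeightDecay optimizer; it only records
    its keyword arguments so the plumbing can be shown without a
    TensorFlow dependency."""

    def __init__(self, **kwargs):
        self.config = dict(kwargs)


def create_optimizer(init_lr,
                     num_train_steps,
                     num_warmup_steps,
                     beta_1=0.9,
                     beta_2=0.999,               # newly exposed
                     epsilon=1e-6,               # newly exposed
                     weight_decay_rate=0.01,     # newly exposed
                     exclude_from_weight_decay=None):  # newly exposed
    # In the real implementation, init_lr would first be wrapped in a
    # warmup + polynomial-decay schedule; here it is passed straight
    # through to keep the sketch self-contained.
    if exclude_from_weight_decay is None:
        exclude_from_weight_decay = ["LayerNorm", "bias"]  # assumed default
    return AdamWeightDecay(
        learning_rate=init_lr,
        beta_1=beta_1,
        beta_2=beta_2,
        epsilon=epsilon,
        weight_decay_rate=weight_decay_rate,
        exclude_from_weight_decay=exclude_from_weight_decay,
    )


opt = create_optimizer(5e-5, 1000, 100, beta_2=0.98, weight_decay_rate=0.1)
print(opt.config["beta_2"])  # 0.98
```

Callers who don't pass the new arguments would see the current behavior, since each parameter keeps a default matching today's hard-coded value.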
3. Additional context
I was recently trying to fine-tune a Huggingface RoBERTa model, and while doing so I wanted to add a learning-rate scheduler as well as AdamW with custom parameters, which is how I came across these methods.
4. Are you willing to contribute it? (Yes or No)
Yes