link for optimizer names (huggingface#32400)
* link for optimizer names

Add a note and a link to where the user can easily find more optimizer names, because there are many more optimizers than are mentioned in the docstring.

* make fixup
nbroad1881 authored and Titus-von-Koeller committed Aug 21, 2024
1 parent 248eee8 commit 03a2d23
Showing 1 changed file with 3 additions and 2 deletions.
5 changes: 3 additions & 2 deletions src/transformers/training_args.py
@@ -611,8 +611,9 @@ class TrainingArguments:
             The options should be separated by whitespaces.
         optim (`str` or [`training_args.OptimizerNames`], *optional*, defaults to `"adamw_torch"`):
-            The optimizer to use: adamw_hf, adamw_torch, adamw_torch_fused, adamw_apex_fused, adamw_anyprecision or
-            adafactor.
+            The optimizer to use, such as "adamw_hf", "adamw_torch", "adamw_torch_fused", "adamw_apex_fused", "adamw_anyprecision",
+            "adafactor". See `OptimizerNames` in [training_args.py](https://github.com/huggingface/transformers/blob/main/src/transformers/training_args.py)
+            for a full list of optimizers.
         optim_args (`str`, *optional*):
             Optional arguments that are supplied to AnyPrecisionAdamW.
         group_by_length (`bool`, *optional*, defaults to `False`):
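For context, here is a minimal sketch (not part of this commit) of how a user might enumerate `OptimizerNames` and pass one of the listed values to `TrainingArguments`. It assumes a `transformers` install where `OptimizerNames` is importable from `transformers.training_args`:

```python
# Minimal sketch: list the optimizer names the Trainer accepts and pick one.
# Assumes `transformers` is installed; `OptimizerNames` lives in
# transformers/training_args.py, the file changed in this commit.
from transformers import TrainingArguments
from transformers.training_args import OptimizerNames

# Print every optimizer name currently accepted by the `optim` argument.
for name in OptimizerNames:
    print(name.value)  # e.g. "adamw_torch", "adamw_torch_fused", "adafactor", ...

# Pass one of the listed string values (or the enum member itself) as `optim`.
args = TrainingArguments(output_dir="out", optim="adafactor")
```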
