
Enable ignore_save_lr_and_optim only for finetune tests
JunnYu committed Feb 21, 2024
1 parent f03d61f commit 71fe5d5
Showing 4 changed files with 0 additions and 4 deletions.
1 change: 0 additions & 1 deletion tests/fixtures/llm/lora.yaml
@@ -25,7 +25,6 @@ lora:
 save_total_limit: 1
 tensor_parallel_degree: 1
 pipeline_parallel_degree: 1
-ignore_save_lr_and_optim: 1
 lora: true
 
 default:
1 change: 0 additions & 1 deletion tests/fixtures/llm/prefix_tuning.yaml
@@ -25,7 +25,6 @@ prefix_tuning:
 save_total_limit: 1
 tensor_parallel_degree: 1
 pipeline_parallel_degree: 1
-ignore_save_lr_and_optim: 1
 prefix_tuning: true
 
 default:
1 change: 0 additions & 1 deletion tests/fixtures/llm/pretrain.yaml
@@ -18,7 +18,6 @@ pretrain:
 use_flash_attention: 0
 use_fused_rms_norm: 0
 continue_training: 1
-ignore_save_lr_and_optim: 1
 
 default:
 llama:
1 change: 0 additions & 1 deletion tests/fixtures/llm/ptq.yaml
@@ -12,7 +12,6 @@ ptq:
 eval_with_do_generation: false
 do_ptq: true
 ptq_step: 4
-ignore_save_lr_and_optim: 1
 
 default:
 llama:
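Per the commit message, ignore_save_lr_and_optim remains enabled only in the finetune test fixture. A minimal sketch of the retained setting, assuming a fixture at tests/fixtures/llm/finetune.yaml laid out like the files above (the path and surrounding keys are assumptions, as that file is not shown in this diff):

finetune:
  save_total_limit: 1
  tensor_parallel_degree: 1
  pipeline_parallel_degree: 1
  # presumably skips saving the LR scheduler and optimizer state at checkpoints;
  # after this commit, only the finetune tests keep it on
  ignore_save_lr_and_optim: 1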
