[LLM] fix lora target modules on llama (#8372)
SylarTiaNII authored May 7, 2024
Parent: ae0bea9 · Commit: 9f3cf82
Showing 1 changed file with 2 additions and 0 deletions.
llm/utils.py
@@ -125,9 +125,11 @@ def get_lora_target_modules(model):
             ".*v_proj.*",
             ".*k_proj.*",
             ".*o_proj.*",
+            ".*qkv_proj.*",
             ".*gate_proj.*",
             ".*down_proj.*",
             ".*up_proj.*",
+            ".*gate_up_fused_proj.*",
         ]
     elif model.base_model_prefix == "opt":
         target_modules = [
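The two added patterns cover llama variants whose attention and MLP use fused weights: a single qkv_proj in place of separate q/k/v projections, and a single gate_up_fused_proj in place of separate gate/up projections. Without these patterns, no target-module regex matches the fused sublayers, so they silently receive no LoRA adapter. Below is a minimal sketch of how such a regex list is typically matched against sublayer names; the layer names and the matcher are illustrative assumptions, not the PaddleNLP implementation.

import re

# Regex patterns for llama LoRA targets, mirroring the list in llm/utils.py
# after this commit (".*q_proj.*" assumed present just above the hunk shown).
TARGET_MODULES = [
    ".*q_proj.*",
    ".*k_proj.*",
    ".*v_proj.*",
    ".*o_proj.*",
    ".*qkv_proj.*",            # fused query/key/value projection
    ".*gate_proj.*",
    ".*down_proj.*",
    ".*up_proj.*",
    ".*gate_up_fused_proj.*",  # fused gate/up projection in the MLP
]

def is_lora_target(layer_name, patterns):
    # A layer gets a LoRA adapter if any pattern matches its full name.
    return any(re.fullmatch(p, layer_name) for p in patterns)

# Hypothetical sublayer names from a fused-attention llama checkpoint:
for name in [
    "llama.layers.0.self_attn.qkv_proj",      # matched only via the new pattern
    "llama.layers.0.mlp.gate_up_fused_proj",  # likewise
    "llama.layers.0.self_attn.rotary_emb",    # never a LoRA target
]:
    print(name, "->", is_lora_target(name, TARGET_MODULES))

Before this commit, the first two names above fell through every pattern, leaving the fused layers without adapters during LoRA fine-tuning.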
