
Commit

Add some LoRA params
oobabooga committed Mar 17, 2023
1 parent 9ed2c45 commit 9256e93
Showing 1 changed file with 5 additions and 1 deletion.
modules/LoRA.py (6 changes: 5 additions & 1 deletion)
@@ -15,4 +15,8 @@ def add_lora_to_model(lora_name):
     else:
         # Why doesn't this work in 16-bit mode?
         print(f"Adding the LoRA {lora_name} to the model...")
-        shared.model = PeftModel.from_pretrained(shared.model, Path(f"loras/{lora_name}"))
+
+        params = {}
+        #params['device_map'] = {'': 0}
+        #params['dtype'] = shared.model.dtype
+        shared.model = PeftModel.from_pretrained(shared.model, Path(f"loras/{lora_name}"), **params)
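
For illustration, here is a minimal sketch of what the new params dict could look like once the commented-out options are enabled. This is not part of the commit: the commit ships params empty, with both options commented out. The helper name add_lora_with_params and the shared.args.cpu check are assumptions added here, and the two keyword arguments are simply the ones the commit leaves commented out; the commit does not confirm that peft accepts them (note the 16-bit question in the comment above).

# Hypothetical sketch (not part of this commit): populate `params` before
# handing it to PeftModel.from_pretrained. The helper name and the
# shared.args.cpu check are assumptions for illustration only.
from pathlib import Path

from peft import PeftModel

import modules.shared as shared

def add_lora_with_params(lora_name):
    params = {}
    if not shared.args.cpu:  # assumed CPU-mode flag
        params['device_map'] = {'': 0}        # put all modules on GPU 0
        params['dtype'] = shared.model.dtype  # match the base model's dtype
    # Forward whatever ended up in params to peft.
    shared.model = PeftModel.from_pretrained(shared.model, Path(f"loras/{lora_name}"), **params)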
