起始日期 | Start Date
No response
实现PR | Implementation PR
No response
相关Issues | Reference Issues
Following the example, I fine-tuned the model and merged the adapter parameters according to the provided demo. When the merged model is loaded for inference, it throws an `Unrecognized configuration class to build an AutoTokenizer` error.
摘要 | Summary
The merge script executed is as follows:
from peft import AutoPeftModelForCausalLM
model = AutoPeftModelForCausalLM.from_pretrained(
path_to_adapter, # path to the output directory
device_map="auto",
trust_remote_code=True
).eval()
merged_model = model.merge_and_unload()
merged_model.save_pretrained(new_model_directory, max_shard_size="2048MB", safe_serialization=True)
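The `Unrecognized configuration class to build an AutoTokenizer` error at load time usually means the merged output directory is missing the tokenizer files: `merge_and_unload()` followed by `save_pretrained()` writes only the model weights and `config.json`, and for a `trust_remote_code` model the tokenizer's custom code and vocabulary files must be copied over by hand. A minimal sketch of that copy step follows; the directory names are illustrative placeholders (simulated here with temporary directories), and the file names are those of a Qwen-style remote-code tokenizer, so adjust the list to your base model:

```python
import os
import shutil
import tempfile

# Hypothetical directories standing in for path_to_adapter and
# new_model_directory from the merge script above.
adapter_dir = tempfile.mkdtemp()
merged_dir = tempfile.mkdtemp()

# save_pretrained() on the merged model writes only the weights and
# config.json -- the tokenizer files are NOT copied automatically.
# For a trust_remote_code model the tokenizer also needs its custom
# *.py files and vocabulary, so copy everything tokenizer-related
# from the adapter (or base model) directory into the merged one.
tokenizer_files = [
    "tokenizer_config.json",
    "tokenization_qwen.py",  # custom tokenizer code (remote-code models)
    "qwen.tiktoken",         # vocabulary file used by the custom tokenizer
]

# Simulate an adapter directory that still holds the tokenizer files.
for name in tokenizer_files:
    with open(os.path.join(adapter_dir, name), "w") as f:
        f.write("placeholder")

# The actual fix: copy each tokenizer file next to the merged weights.
for name in tokenizer_files:
    src = os.path.join(adapter_dir, name)
    if os.path.exists(src):
        shutil.copy(src, merged_dir)

print(sorted(os.listdir(merged_dir)))
```

After the copy, `AutoTokenizer.from_pretrained(new_model_directory, trust_remote_code=True)` should again be able to resolve the custom tokenizer class from the merged directory.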
基本示例 | Basic Example
None
缺陷 | Drawbacks
未解决问题 | Unresolved questions
No response