
Question about merging the 7B model with a LoRA model #543

@ericzfguo

Description


Hello author, I'd like to ask: when using merge_peft_adapter.py to merge the pretrained 7B model with a trained LoRA model, how should the parameters --model_type, --tokenizer_path, and --resize_emb be set correctly? Thank you!
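For context, a LoRA merge of this kind generally loads the base model, optionally resizes its embeddings to match the tokenizer, applies the adapter, and folds the adapter weights back in. The sketch below shows that flow with the standard peft/transformers APIs; the paths are placeholders, and it is only a guess at what the repository's merge_peft_adapter.py does with these flags, not its actual implementation. --tokenizer_path would typically point at the base model's tokenizer (or the training tokenizer, if new tokens were added), and --resize_emb would only matter when the vocabulary grew during training.

```python
# Minimal sketch of a typical PEFT LoRA merge; paths are placeholders,
# not values taken from this repository.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_path = "path/to/llama-7b"      # pretrained 7B base model
lora_path = "path/to/lora-adapter"  # trained LoRA checkpoint
out_path = "path/to/merged-7b"

tokenizer = AutoTokenizer.from_pretrained(base_path)
model = AutoModelForCausalLM.from_pretrained(base_path)

# If training added new tokens, the embedding matrix must be resized to the
# tokenizer's vocabulary before loading the adapter; this is the situation a
# --resize_emb style flag usually covers.
if len(tokenizer) != model.get_input_embeddings().weight.shape[0]:
    model.resize_token_embeddings(len(tokenizer))

model = PeftModel.from_pretrained(model, lora_path)
merged = model.merge_and_unload()  # fold LoRA weights into the base model

merged.save_pretrained(out_path)
tokenizer.save_pretrained(out_path)
```

If the base model's vocabulary was unchanged during LoRA training, the resize step is a no-op and the merge reduces to load, apply adapter, merge_and_unload, save.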

