Are there any available tools that can convert the original .pth to safetensors #191
Comments
Can you share the errors you are seeing? An alternative is to download the corresponding safetensors versions that we upload to Hugging Face.
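For the download route, a minimal sketch using huggingface_hub (the repo id, file patterns, and local directory below are placeholder assumptions, not from the thread):

```python
from huggingface_hub import snapshot_download

# Fetch only the safetensors weights plus the config/tokenizer files,
# skipping any duplicate .pth/.bin copies in the repo.
snapshot_download(
    repo_id="meta-llama/Llama-3.1-8B-Instruct",  # hypothetical example; substitute your model
    allow_patterns=["*.safetensors", "*.json", "tokenizer.*"],
    local_dir="./llama-safetensors",
)
```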
huggingface/transformers#33791

```
python3 /home/transformers/src/transformers/models/llama/convert_llama_weights_to_hf.py
```

```
You are using the default legacy behaviour of the <class 'transformers.models.llama.tokenization_llama_fast.LlamaTokenizerFast'>. This is expected, and simply means that the legacy (previous) behavior will be used so nothing changes for you. If you want to use the new behaviour, set legacy=False. This should only be set if you understand what it means, and thoroughly read the reason why this was added as explained in huggingface/transformers#24565 - if you loaded a llama tokenizer from a GGUF file you can ignore this message.
```
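Note that the command above passes no arguments, while the script expects at least the input and output directories. A hedged invocation sketch (flag names taken from the script's argument parser in recent transformers releases; all paths and the model size are placeholders):

```
python3 src/transformers/models/llama/convert_llama_weights_to_hf.py \
    --input_dir /path/to/meta/checkpoint \
    --model_size 7B \
    --output_dir /path/to/hf/output
```

Recent versions of the script write .safetensors shards by default (safe serialization); check `--help` for the exact flags in your transformers version.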
Are there any available tools that can convert the original .pth model files downloaded from Meta into a format usable by the stack, or convert them to the .safetensors format? I tried the tool at https://github.com/huggingface/transformers/blob/main/src/transformers/models/llama/convert_llama_weights_to_hf.py, but it threw an error during execution.
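If the goal is only the container format (not the Hugging Face directory layout), the raw tensors can be re-saved directly. A minimal sketch assuming a single consolidated.00.pth shard; the file names are placeholders, and the keys stay in Meta's naming scheme, so this does not produce a transformers-loadable checkpoint on its own:

```python
import torch
from safetensors.torch import save_file

# Meta's .pth shards are plain state dicts of tensors.
state_dict = torch.load("consolidated.00.pth", map_location="cpu")

# safetensors rejects shared or non-contiguous storage, so materialize each tensor.
tensors = {
    k: v.clone().contiguous()
    for k, v in state_dict.items()
    if isinstance(v, torch.Tensor)
}

save_file(tensors, "consolidated.00.safetensors")
```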