Description
System Info
Hi,
I have downloaded the Llama 2 weights and installed the transformers package. I plan to use the model with transformers, so I ran the conversion script.
The conversion script fails:
python src/transformers/models/llama/convert_llama_weights_to_hf.py \
    --input_dir /path/to/downloaded/llama/weights --model_size 7B --output_dir /output/path/tomyfilepath
File "...path/src/transformers/models/llama/convert_llama_weights_to_hf.py", line 126
print(f"Fetching all parameters from the checkpoint at {input_base_path}.")
^
SyntaxError: invalid syntax
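(A guess on my part, not confirmed: the f-string on that line requires Python 3.6 or newer, so this error might simply mean that python resolves to an older interpreter on my machine. A quick check would rule that out:)

python --version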
On Linux, when I run, for example:
ls /path/to/downloaded/llama/llama-2-7b-chat
I get:
checklist.chk consolidated.00.pth params.json
I assume I have the correct files. Any advice would be greatly appreciated.
Who can help?
No response
Information
- The official example scripts
- My own modified scripts
Tasks
- An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
- My own task or dataset (give details below)
Reproduction
python src/transformers/models/llama/convert_llama_weights_to_hf.py \
    --input_dir /path/to/downloaded/llama/weights --model_size 7B --output_dir /output/path/tomyfilepath
Expected behavior
The tokenizer and model should be converted so that they can be loaded with the transformers package.
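For example, after conversion I would expect to be able to load the output directory roughly like this (a sketch using the standard transformers loading API; the path is the placeholder from the command above):

from transformers import LlamaForCausalLM, LlamaTokenizer

# Placeholder path: the --output_dir passed to the conversion script.
output_dir = "/output/path/tomyfilepath"
tokenizer = LlamaTokenizer.from_pretrained(output_dir)
model = LlamaForCausalLM.from_pretrained(output_dir)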