[BUG] CodeGen 2.5 Tokenizer cannot be initialized anymore #94
Comments
Seems to be an issue with |
Any fix for this? I'm having trouble quantizing the model into GGUF format.
The only workaround I found so far is https://onetwobytes.com/2024/10/07/codegen2-5-llm-not-working-with-latest-huggingface-transformers/
@AlEscher I tried converting it to GGML so that I can quantize it and convert it to GGUF, but I'm still having trouble. Did you manage to quantize it, or are you running the full weights?
@ahmedashraf443 I am running full weights. By installing the specified |
Yeah, I thought so. Sadly I can't run the full weights on my laptop, and when I tried quantizing the model it never worked. Guess I'll have to stick to Qwen2.5-Coder.
The code from https://huggingface.co/Salesforce/codegen25-7b-multi_P#causal-sampling-code-autocompletion and https://github.com/salesforce/CodeGen/tree/main/codegen25#sampling does not work currently.
Creating the tokenizer like
gives an error:
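(The exact snippet and traceback were not captured above. A reproduction sketch, assuming the standard `transformers` AutoTokenizer API and the hub id from the link, would be along these lines:)

```python
# Reproduction sketch for the reported tokenizer failure. The model id is
# taken from the Hugging Face link above; the import is deferred so this
# file loads even where the (heavy, optional) transformers package is absent.
MODEL_ID = "Salesforce/codegen25-7b-multi_P"

def load_codegen25_tokenizer():
    """Attempt the AutoTokenizer init that the issue reports as broken."""
    from transformers import AutoTokenizer  # deferred optional dependency
    # trust_remote_code is needed because CodeGen 2.5 ships a custom
    # tiktoken-based tokenizer class alongside the weights.
    return AutoTokenizer.from_pretrained(MODEL_ID, trust_remote_code=True)
```

With recent `transformers` releases, the report says this call raises an error while building the custom tokenizer.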
I have installed tiktoken==0.8.0, as installation of tiktoken==0.4.0 via pip fails.
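Since the thread hinges on exact package versions, a small stdlib helper (hypothetical, not from the report) can verify the pins before running the sampling code:

```python
from importlib.metadata import PackageNotFoundError, version

def check_pin(package: str, required: str) -> bool:
    """Return True iff `package` is installed at exactly version `required`."""
    try:
        return version(package) == required
    except PackageNotFoundError:
        # Package not installed at all.
        return False
```

For example, `check_pin("tiktoken", "0.8.0")` confirms the workaround version is in place before attempting the tokenizer initialization.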