ValueError: Cannot use apply_chat_template() because tokenizer.chat_template is not set #33246
Yes, you're right about the cause! Rather than trying to merge a proper chat template for Blenderbot (which is very obsolete by now), I'll just rewrite the doc to use a different model.
@Rocketknight1 I'm getting the same error when I try to use some models like gemma. I can try to use the template parameter, but I'm not sure what the format is for the gemma model (I can look it up in the tokenizer_config.json, right?). Is this pretty much what we now have to do when we get this error: manually set the template for models that don't accept one?
Hi @PhilipAmadasun, the most likely cause is that you're loading the base gemma models, which don't have a chat template, rather than the instruction-tuned variants.
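To make the discussion above concrete, here's a toy sketch of what a chat template does: it turns a list of `{"role", "content"}` dicts into the single prompt string the model was trained on. The ChatML-style tokens below are just an illustration, not gemma's actual format — the real template lives in the model's tokenizer_config.json.

```python
# Toy illustration of what a chat template does. This ChatML-style format
# is an example only, NOT gemma's actual template -- check the model's
# tokenizer_config.json for the real one.
def render_chatml(messages, add_generation_prompt=False):
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n")
    if add_generation_prompt:
        # Open an assistant turn so the model knows to respond next.
        parts.append("<|im_start|>assistant\n")
    return "".join(parts)

prompt = render_chatml(
    [{"role": "user", "content": "Hello!"}],
    add_generation_prompt=True,
)
print(prompt)
```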
Also @NielsRogge, the fix has now been merged.
Hi, I understand the current changes, but there's still a lot of code that actually uses the default chat template. I'd like to know what the default chat template was before, so I can set the `chat_template` in tokenizer_config.
Hi @daidaiershidi, using the old 'default' chat template with models that were not trained with it will produce very inaccurate results! This is a big part of the reason the default templates were removed. Can you tell me which models you're working with that used to have default chat templates? It's possible that some of them actually should have templates, in which case we can add them. |
The WizardLM family — none of their tokenizer_config files have a chat_template (https://huggingface.co/WizardLMTeam/WizardLM-13B-V1.2/blob/main/tokenizer_config.json) — and others like https://huggingface.co/layoric/llama-2-13b-code-alpaca/blob/main/tokenizer_config.json
THIS. I am running a small LLM on a small machine, just to test that the code and API work. I do not care about correct responses here. Let us run the code by forcing some template!
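If you really just need the call to succeed and don't care about output quality, one option is to assign a template yourself before calling `apply_chat_template()`. The ChatML-style Jinja string below is a common generic choice, not any particular model's official template, so expect degraded responses from models that weren't trained on it:

```
{% for message in messages %}{{ '<|im_start|>' + message['role'] + '\n' + message['content'] + '<|im_end|>\n' }}{% endfor %}{% if add_generation_prompt %}{{ '<|im_start|>assistant\n' }}{% endif %}
```

Assign it with `tokenizer.chat_template = "..."` (or pass it via the `chat_template` argument of `apply_chat_template`).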
System Info
Transformers v4.45.0.dev0
Who can help?
@Rocketknight1
Reproduction
The code snippet from here doesn't seem to work. I assume this is because models no longer have a default chat template if they don't have one set:
results in:
Expected behavior
A working code snippet