Your current environment
Latest vLLM commit as of the time of posting: 943ffa57032d62c21610e9cebffbdbe6c5c886ca
🐛 Describe the bug
The latest Mistral reasoning models (e.g. mistralai/Magistral-Small-2506) use the v11 tokenizer format, which is only supported by mistral_common>=1.6.0. With an older mistral_common installed, the following error occurs when loading the model in vLLM:
File "/usr/local/lib/python3.12/dist-packages/vllm/transformers_utils/tokenizer.py", line 222, in get_tokenizer
tokenizer = MistralTokenizer.from_pretrained(str(tokenizer_name),
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/dist-packages/vllm/transformers_utils/tokenizers/mistral.py", line 241, in from_pretrained
mistral_tokenizer = PublicMistralTokenizer.from_file(tokenizer_file)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/dist-packages/mistral_common/tokens/tokenizers/mistral.py", line 184, in from_file
tokenizer = Tekkenizer.from_file(tokenizer_filename)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/dist-packages/mistral_common/tokens/tokenizers/tekken.py", line 151, in from_file
raise ValueError(
ValueError: Unknown version: v11 in /root/.cache/huggingface/hub/models--mistralai--Magistral-Small-2506/snapshots/48c97929837c3189cb3cf74b1b5bc5824eef5fcc/tekken.json. Make sure to use a valid version string: ['v1', 'v2', 'v3', 'v7']
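A quick way to confirm the mismatch before starting the server is to check the installed mistral_common version. The snippet below is a minimal sketch, assuming (per the report above) that v11 tekken.json support arrives with mistral_common 1.6.0; the version threshold and the error wording are mine, not from vLLM:

```python
# Minimal sketch: fail fast if the installed mistral_common is too old to
# parse the v11 tekken.json shipped with Magistral-Small-2506.
# Assumption: v11 support is available starting with mistral_common 1.6.0.
from importlib.metadata import version

from packaging.version import Version

installed = Version(version("mistral_common"))
required = Version("1.6.0")

if installed < required:
    raise RuntimeError(
        f"mistral_common {installed} cannot read v11 tokenizers; "
        f"upgrade with: pip install 'mistral_common>={required}'"
    )
print(f"mistral_common {installed} should handle the v11 tokenizer.")
```

On vLLM's side, the fix is presumably to raise the pinned lower bound for mistral_common in its requirements so that these models load out of the box.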
Before submitting a new issue...
- Make sure you already searched for relevant issues, and asked the chatbot living at the bottom right corner of the documentation page, which can answer lots of frequently asked questions.