System Info
transformers == 4.57.2
vllm == 0.11.0
Who can help?
No response
Information
- The official example scripts
- My own modified scripts
Tasks
- An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
- My own task or dataset (give details below)
Reproduction
When loading Qwen/Qwen3-235B-A22B-Thinking-2507 with vllm.entrypoints.openai.api_server, the server fails during tokenizer loading.
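A launch command along these lines reproduces it (the tensor-parallel and other serving flags from my environment are omitted, since they should not affect the tokenizer-loading failure):

    python -m vllm.entrypoints.openai.api_server --model Qwen/Qwen3-235B-A22B-Thinking-2507

Startup then aborts with the following traceback: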
File "/root/vllm_env/lib/python3.10/site-packages/vllm/transformers_utils/tokenizer.py", line 217, in get_tokenizer
    tokenizer = AutoTokenizer.from_pretrained(
File "/root/vllm_env/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py", line 1156, in from_pretrained
    return tokenizer_class.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
File "/root/vllm_env/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 2112, in from_pretrained
    return cls._from_pretrained(
File "/root/vllm_env/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 2419, in _from_pretrained
    if _is_local and _config.model_type not in [
AttributeError: 'dict' object has no attribute 'model_type'
Expected behavior
The tokenizer should load without error. At the check shown in the traceback in tokenization_utils_base.py, _config is a plain dict (hence the AttributeError), so the attribute access _config.model_type should be replaced with _config.get("model_type").
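A minimal standalone sketch of the failure mode and the suggested defensive access; the names mirror the traceback, but the snippet is illustrative and not the actual transformers internals:

    # Assumption: the local tokenizer/model config is loaded as a plain dict here.
    _config = {"model_type": "qwen3_moe"}  # illustrative value

    # What the current check effectively does, and why it raises:
    try:
        _ = _config.model_type  # dicts expose keys, not attributes
    except AttributeError as exc:
        print(f"current behaviour: {exc}")

    # Suggested access that works whether _config is a dict or a config object:
    model_type = (
        _config.get("model_type")
        if isinstance(_config, dict)
        else getattr(_config, "model_type", None)
    )
    print(f"suggested behaviour: model_type = {model_type!r}")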