[Bug]: MiniCPM-o int4 #17358

Closed
@JIA-HONG-CHU

Description


Your current environment

Jetson Orin 64 GB
vLLM 0.8.5

🐛 Describe the bug

When I load MiniCPM-o int4 with "dtype": "float16" and "quantization": "gptq", I encounter the following error:

File "/usr/local/lib/python3.10/dist-packages/vllm/model_executor/model_loader/loader.py", line 455, in load_model
    loaded_weights = model.load_weights(
  File "/usr/local/lib/python3.10/dist-packages/vllm/model_executor/models/minicpmo.py", line 531, in load_weights
    return loader.load_weights(weights)
  File "/usr/local/lib/python3.10/dist-packages/vllm/model_executor/models/utils.py", line 261, in load_weights
    autoloaded_weights = set(self._load_module("", self.module, weights))
  File "/usr/local/lib/python3.10/dist-packages/vllm/model_executor/models/utils.py", line 222, in _load_module
    yield from self._load_module(prefix,
  File "/usr/local/lib/python3.10/dist-packages/vllm/model_executor/models/utils.py", line 222, in _load_module
    yield from self._load_module(prefix,
  File "/usr/local/lib/python3.10/dist-packages/vllm/model_executor/models/utils.py", line 250, in _load_module
    raise ValueError(msg)
ValueError: There is no module or parameter named 'resampler.kv_proj.weight' in MiniCPMO
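For reference, a minimal sketch of the engine configuration that triggers this. The model path below is a placeholder (the report does not give the exact checkpoint path); the dtype and quantization values are the ones quoted above, and trust_remote_code is assumed since MiniCPM-o ships custom model code.

```python
# Hedged repro sketch: "MiniCPM-o-2_6-int4" is a placeholder path, not
# verified against the reporter's setup.
engine_kwargs = {
    "model": "MiniCPM-o-2_6-int4",  # placeholder local/int4 GPTQ checkpoint
    "dtype": "float16",
    "quantization": "gptq",
    "trust_remote_code": True,      # MiniCPM-o requires custom model code
}

# The ValueError above is raised inside model.load_weights() during
# engine construction, i.e. roughly at this call:
# from vllm import LLM
# llm = LLM(**engine_kwargs)
```

The failure happens at weight-loading time, before any inference runs, so it reproduces on engine startup alone.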

Before submitting a new issue...

  • Make sure you have already searched for relevant issues and asked the chatbot at the bottom-right corner of the documentation page, which can answer many frequently asked questions.

Metadata

Labels

bug (Something isn't working)
