
Commit 68b90d9

Attempt to fix eagle. After vllm-project#18781, speculative_config.draft_model_config.hf_config.model_type will be overwritten with the main model type.
Signed-off-by: Gregory Shtrasberg <Gregory.Shtrasberg@amd.com>
1 parent 7f21e80 commit 68b90d9

File tree

2 files changed: +4 -2 lines


vllm/model_executor/models/eagle.py

Lines changed: 3 additions & 1 deletion
@@ -249,7 +249,9 @@ def load_weights(self, weights: Iterable[tuple[str, torch.Tensor]]):
         lm_head_weight = torch.zeros(
             self.lm_head.org_vocab_size,
             self.lm_head.embedding_dim,
-            dtype=self.config.torch_dtype,
+            dtype=getattr(torch, self.config.torch_dtype)
+            if type(self.config.torch_dtype) is str else
+            self.config.torch_dtype,
         )

         weight_loader = getattr(self.lm_head.weight, "weight_loader",
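
The dtype change above guards against HF configs that store torch_dtype as a string (e.g. "bfloat16") instead of a torch.dtype. A minimal standalone sketch of the same coercion; the helper name resolve_torch_dtype is hypothetical and not part of this commit:

import torch

def resolve_torch_dtype(torch_dtype):
    # Hypothetical helper: if the config serialized the dtype as a string
    # such as "bfloat16", look up the matching torch.dtype attribute;
    # otherwise assume it already is a torch.dtype and return it unchanged.
    if isinstance(torch_dtype, str):
        return getattr(torch, torch_dtype)
    return torch_dtype

assert resolve_torch_dtype("bfloat16") is torch.bfloat16
assert resolve_torch_dtype(torch.float32) is torch.float32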

vllm/transformers_utils/configs/eagle.py

Lines changed: 1 addition & 1 deletion
@@ -70,7 +70,7 @@ def __init__(self,

         if self.model is not None:
             for k, v in self.model.to_dict().items():
-                if k not in kwargs:
+                if k not in kwargs and not hasattr(self, k):
                     setattr(self, k, v)

     @classmethod
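
The added hasattr check stops fields copied from the wrapped draft-model config from clobbering attributes the EAGLE config already defines, such as model_type, which per the commit message would otherwise end up overwritten with the main model type after vllm-project#18781. A minimal sketch of the guard, using a made-up OuterConfig class rather than vLLM's actual EAGLEConfig:

class OuterConfig:
    model_type = "eagle"  # must survive the attribute copy below

    def __init__(self, inner_fields, **kwargs):
        for k, v in inner_fields.items():
            # Mirror the `k not in kwargs and not hasattr(self, k)` guard:
            # skip keys passed explicitly and keys already defined on self.
            if k not in kwargs and not hasattr(self, k):
                setattr(self, k, v)

cfg = OuterConfig({"model_type": "llama", "hidden_size": 4096})
assert cfg.model_type == "eagle"  # not clobbered by the inner config
assert cfg.hidden_size == 4096    # new fields are still copied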
