
Commit eb28215

xiangxu-google authored and alexm-redhat committed

Use head_dim in config if exists (vllm-project#2622)

1 parent 1d6bdb8 commit eb28215

File tree

1 file changed: +2 -0 lines changed


vllm/config.py

Lines changed: 2 additions & 0 deletions
```diff
@@ -248,6 +248,8 @@ def get_hidden_size(self) -> int:
         return self.hf_config.hidden_size
 
     def get_head_size(self) -> int:
+        if hasattr(self.hf_config, "head_dim"):
+            return self.hf_config.head_dim
         # FIXME(woosuk): This may not be true for all models.
         return self.hf_config.hidden_size // self.hf_config.num_attention_heads
 
```
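
For context, here is a minimal, self-contained sketch of the lookup this commit introduces. The free `get_head_size` function and the `SimpleNamespace` stand-in for `hf_config` are illustrative only; in vLLM this logic lives as a method on the model config class.

```python
# Illustrative sketch, not vLLM code: a Hugging Face-style config is
# mocked with SimpleNamespace so the lookup can run standalone.
from types import SimpleNamespace


def get_head_size(hf_config) -> int:
    # Prefer an explicit head_dim when the model config provides one;
    # some models use a head size that is not hidden_size / num_heads.
    if hasattr(hf_config, "head_dim"):
        return hf_config.head_dim
    # Fallback: derive the head size from the hidden size.
    # FIXME(woosuk): This may not be true for all models.
    return hf_config.hidden_size // hf_config.num_attention_heads


# With an explicit head_dim, the attribute wins:
cfg = SimpleNamespace(hidden_size=4096, num_attention_heads=32, head_dim=256)
print(get_head_size(cfg))  # 256, not 4096 // 32 == 128

# Without it, the old derivation still applies:
cfg = SimpleNamespace(hidden_size=4096, num_attention_heads=32)
print(get_head_size(cfg))  # 128
```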

0 commit comments
