[Misc]Fix BitAndBytes exception messages (vllm-project#7626)
jeejeelee authored and omrishiv committed Aug 26, 2024
1 parent d87ddc2 commit 6737fba
Showing 1 changed file with 2 additions and 2 deletions: vllm/model_executor/model_loader/loader.py
@@ -883,11 +883,11 @@ def _load_weights(self, model_config: ModelConfig,
         if not hasattr(model, 'load_weights'):
             raise AttributeError(
                 "The required method 'load_weights' is not defined in class"
-                f" {type(self).__name__}.")
+                f" {type(model).__name__}.")

         if not hasattr(model, 'bitsandbytes_stacked_params_mapping'):
             raise AttributeError(
-                f"Model {type(self).__name__} does not support BitsAndBytes "
+                f"Model {type(model).__name__} does not support BitsAndBytes "
                 "quantization yet.")

         logger.info("Loading weights with BitsAndBytes quantization. "
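The change matters because `_load_weights` runs on the loader object, not the model: `type(self).__name__` would name the loader class in the error message, while `type(model).__name__` names the model that actually lacks BitsAndBytes support. A minimal sketch of the effect, using hypothetical class names (`ToyModel`, `BitsAndBytesLoader` are illustrations, not vLLM's actual classes):

```python
class ToyModel:
    """Hypothetical model with no BitsAndBytes support mapping."""
    pass


class BitsAndBytesLoader:
    """Hypothetical loader; stands in for vLLM's model loader."""

    def _load_weights(self, model):
        # Inside this method, `self` is the loader. Using type(self) here
        # (the pre-fix behavior) would blame "BitsAndBytesLoader" instead
        # of the model class, which is what the commit corrects.
        if not hasattr(model, 'bitsandbytes_stacked_params_mapping'):
            raise AttributeError(
                f"Model {type(model).__name__} does not support BitsAndBytes "
                "quantization yet.")


msg = ""
try:
    BitsAndBytesLoader()._load_weights(ToyModel())
except AttributeError as e:
    msg = str(e)

# With the fix, the message names the model class, e.g.:
# "Model ToyModel does not support BitsAndBytes quantization yet."
print(msg)
```

With `type(self)` the message would have read "Model BitsAndBytesLoader does not support...", pointing users at the wrong class entirely.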
