How do I pass a local model path? #48

@cqray1990

Description

if model_name.startswith("hosted_vllm/") and (
    "localhost" in host or host == "0.0.0.0" or host == "127.0.0.1"
):

The source code contains this check. Isn't it too strict? "hosted_vllm" is just a prefix, not part of the folder name, so passing a local path directly raises the error below:

litellm.exceptions.BadRequestError: litellm.BadRequestError: LLM Provider NOT provided. Pass in the LLM provider you are trying to call. You passed model=/QWEN2/Qwen2.5-VL-7B-Instruct-AWQ
Pass model as E.g. For 'Huggingface' inference endpoints pass in `completion(model='huggingface/starcoder',..)
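A possible workaround (an assumption on my part, not confirmed by the maintainers): litellm parses the provider from a "provider/" prefix on the model string, so prepending "hosted_vllm/" to the local path should let the provider check in the snippet above match. The helper name below is hypothetical.

```python
# Hypothetical sketch: litellm infers the LLM provider from a "provider/"
# prefix on the model string. Prefixing a local filesystem path with
# "hosted_vllm/" lets the provider be parsed even though the remainder of
# the string is a path, which should avoid the BadRequestError above.

def with_provider_prefix(model_path: str, provider: str = "hosted_vllm") -> str:
    """Prepend a litellm provider prefix unless one is already present."""
    if model_path.startswith(provider + "/"):
        return model_path
    return f"{provider}/{model_path}"

local_model = with_provider_prefix("/QWEN2/Qwen2.5-VL-7B-Instruct-AWQ")
print(local_model)  # hosted_vllm//QWEN2/Qwen2.5-VL-7B-Instruct-AWQ

# With litellm installed, the call would then look roughly like (untested;
# the api_base value is an assumed local vLLM server address):
# import litellm
# resp = litellm.completion(
#     model=local_model,
#     api_base="http://localhost:8000/v1",
#     messages=[{"role": "user", "content": "hello"}],
# )
```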
