```python
if model_name.startswith("hosted_vllm/") and (
    "localhost" in host or host == "0.0.0.0" or host == "127.0.0.1"
):
```
The source code contains this check. Isn't `hosted_vllm/` just a folder name? Isn't this condition too restrictive? Passing the model path directly raises the error below:

```
litellm.exceptions.BadRequestError: litellm.BadRequestError: LLM Provider NOT provided. Pass in the LLM provider you are trying to call. You passed model=/QWEN2/Qwen2.5-VL-7B-Instruct-AWQ
Pass model as E.g. For 'Huggingface' inference endpoints pass in `completion(model='huggingface/starcoder',..)
```
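For reference, the behavior of the quoted condition can be sketched as a standalone function (a minimal reproduction for illustration; the function name is hypothetical and this is not litellm's actual implementation):

```python
def is_local_hosted_vllm(model_name: str, host: str) -> bool:
    # Mirrors the quoted check: the model name must carry the
    # "hosted_vllm/" prefix AND the host must look local.
    return model_name.startswith("hosted_vllm/") and (
        "localhost" in host or host == "0.0.0.0" or host == "127.0.0.1"
    )

# A bare filesystem path fails the prefix test, so this branch is
# skipped and no provider can be inferred from the model string,
# which leads to the BadRequestError shown above.
print(is_local_hosted_vllm("/QWEN2/Qwen2.5-VL-7B-Instruct-AWQ", "127.0.0.1"))
# -> False

# Prefixing the path with "hosted_vllm/" satisfies the check.
print(is_local_hosted_vllm("hosted_vllm//QWEN2/Qwen2.5-VL-7B-Instruct-AWQ", "127.0.0.1"))
# -> True
```

This illustrates the reporter's point: the check keys on a string prefix, not on what the path actually contains, so a valid local model path without the prefix is rejected.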