Closed
Description
The current behaviour of vLLM does not match the behaviour of OpenAI and Azure OpenAI when it comes to the `stream` parameter in the request body.
Current behaviour of OpenAI and Azure OpenAI:
- Only `"stream": true` or `"stream": false` are accepted. Setting `"stream": "true"` or `"stream": "false"` (or any other non-Boolean value) will raise the following error:
{
  "error": {
    "message": "'false' is not of type 'boolean' - 'stream'",
    "type": "invalid_request_error",
    "param": null,
    "code": null
  }
}
Current behaviour of vLLM:
- The following values for the `stream` request body parameter are accepted by vLLM: `true`, `"true"`, `false`, `"false"`
- Any other value will raise the following error:
{
  "object": "error",
  "message": "[{'loc': ('body', 'stream'), 'msg': 'value could not be parsed to a boolean', 'type': 'type_error.bool'}]",
  "type": "invalid_request_error",
  "param": null,
  "code": null
}
- It seems like this is caused by the use of Pydantic, with the `stream` variable set to type `bool` instead of `StrictBool` (source code)
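A minimal sketch of the difference, assuming Pydantic (the behaviour is the same in v1 and v2): a plain `bool` field coerces the strings `"true"`/`"false"`, while `StrictBool` rejects anything that is not a JSON Boolean. The model names below are illustrative, not vLLM's actual request classes.

```python
from pydantic import BaseModel, StrictBool, ValidationError


class LaxRequest(BaseModel):
    # Plain bool: Pydantic coerces "true"/"false" strings
    # (the current vLLM behaviour described above).
    stream: bool = False


class StrictRequest(BaseModel):
    # StrictBool: only true/false are accepted
    # (matching the OpenAI / Azure OpenAI behaviour).
    stream: StrictBool = False


# The string "true" is silently coerced to the Boolean True.
print(LaxRequest(stream="true").stream)

# The same input is rejected by the strict model.
try:
    StrictRequest(stream="true")
except ValidationError as exc:
    print("rejected:", exc.errors()[0]["type"])
```

Switching the field annotation to `StrictBool` would therefore be enough to reproduce the OpenAI-style rejection.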
May I know if there is agreement that vLLM should also reject the request when `stream` is set to the string values `"true"` or `"false"`? I can prepare a PR to make the change if that is the desired behaviour.
lizzzcai