
stream should only accept type Boolean when using OpenAI API Server spec #1273

@xvnyv

Description


The current behaviour of vLLM does not match that of OpenAI and Azure OpenAI when it comes to the `stream` parameter in the request body.

Current behaviour of OpenAI and Azure OpenAI:

  • Only "stream": true or "stream": false are accepted. Setting "stream": "true" or "stream": "false" (or any other non-Boolean values) will raise the following error:
{
  "error": {
    "message": "'false' is not of type 'boolean' - 'stream'",
    "type": "invalid_request_error",
    "param": null,
    "code": null
  }
}

Current behaviour of vLLM:

  • The following values for the stream request body parameter are accepted by vLLM: true, "true", false, "false"
  • Any other values will raise the following error:
{
    "object": "error",
    "message": "[{'loc': ('body', 'stream'), 'msg': 'value could not be parsed to a boolean', 'type': 'type_error.bool'}]",
    "type": "invalid_request_error",
    "param": null,
    "code": null
}
  • This appears to be caused by the Pydantic request model declaring the `stream` field with type `bool` instead of `StrictBool` (source code)
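The difference between the two behaviours can be sketched without Pydantic itself. The following is a rough approximation, assuming Pydantic v1's lenient `bool` coercion (which accepts strings like `"true"`/`"false"`) versus the strict check that `StrictBool` performs; the function names are illustrative, not from the vLLM codebase:

```python
def lenient_bool(value):
    """Approximates Pydantic's lenient `bool` coercion (current vLLM behaviour):
    genuine booleans pass through, and the strings "true"/"false" are coerced."""
    if isinstance(value, bool):
        return value
    if isinstance(value, str) and value.lower() in {"true", "false"}:
        return value.lower() == "true"
    raise ValueError("value could not be parsed to a boolean")


def strict_bool(value):
    """Approximates `StrictBool` (OpenAI/Azure behaviour):
    only a genuine JSON boolean is accepted; strings are rejected."""
    if isinstance(value, bool):
        return value
    raise ValueError(f"{value!r} is not of type 'boolean' - 'stream'")
```

With `lenient_bool`, a request carrying `"stream": "true"` is silently accepted, whereas `strict_bool` raises, matching the OpenAI error shown above.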

May I know if there is agreement that vLLM should also reject requests where `stream` is set to the string values `"true"` or `"false"`? I can prepare a PR to make the change if that is the desired behaviour.
