Llama.cpp server doesn't return grammar error messages when in streaming mode #7391

Closed
@richardanaya

Description

When you run a streaming request against the llama.cpp server, there appears to be no way to see grammar-related error messages in the HTTP response.
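For reference, a minimal reproduction sketch in Python, assuming the server is running locally on its default port and exposes the `/completion` endpoint with a GBNF `grammar` field; the host, port, prompt, and grammar string below are illustrative:

```python
import requests

# Reproduction sketch: send a streaming completion request with a
# deliberately malformed GBNF grammar and inspect what comes back.
payload = {
    "prompt": "List three colors.",
    "grammar": 'root ::= "unterminated',  # intentionally broken grammar
    "stream": True,
}

with requests.post(
    "http://127.0.0.1:8080/completion", json=payload, stream=True
) as resp:
    print("HTTP status:", resp.status_code)
    # In streaming mode the body is a server-sent-event stream; print the
    # raw lines to see whether any grammar error is reported to the client.
    for line in resp.iter_lines(decode_unicode=True):
        if line:
            print(line)
```

As reported above, nothing in the streamed output identifies the grammar parse failure, so the client cannot tell why generation went wrong.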
