bug: responses from /chat/completions endpoint contain a leading space in the content #2548
Comments
ggerganov/llama.cpp#3664 might be related? (That would mean it's in nitro.exe, which uses llama.cpp.)
Reading the 2 issues above plus ggerganov/llama.cpp#4081, the leading space appears to be added during tokenization on purpose and is even needed for some models to work correctly.
Without going further into the rabbit hole of how tokenization works internally and whether it applies to completion...
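The leading-space behaviour discussed above can be illustrated without touching llama.cpp internals. SentencePiece-style tokenizers (used by LLaMA-family models) mark word boundaries with U+2581 ("▁") and conventionally prepend one to the input, so a naive decode of the first token re-emits it as a leading space. A minimal sketch (the tokens and `naive_decode` helper are hypothetical, for illustration only):

```python
# Sketch of why decoded output starts with a space: SentencePiece-style
# tokens carry a U+2581 word-boundary marker, and one is prepended to the
# first word during tokenization.
def naive_decode(tokens):
    # Join tokens and turn each boundary marker back into a space.
    return "".join(tokens).replace("\u2581", " ")

tokens = ["\u2581Hello", "\u2581world"]  # what "Hello world" tokenizes to
print(repr(naive_decode(tokens)))       # -> ' Hello world' (leading space)
```

If the server returns this decoded text verbatim, the extra space ends up in `message.content`, which matches the symptom reported here.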
Hi @Propheticus, the dev team resolved the issue. Would you mind retrying it? Many thanks 🙏
Looks good to me @Van-QA 👍
Jan's API server responds with a leading space in the content. This leads to broken output (markdown tables don't render correctly) and illegal file names when the output is used to generate note titles, which are in turn used as the .md filename.
Call:
Response:
Expected output:
Tested with `"stream": false` as well; the same is true for un-chunked `chat.completion` objects.
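Until the fix lands server-side, a client-side workaround is to strip the single leading space from the assistant message before using it for filenames or markdown. A hedged sketch (the `clean_content` helper is hypothetical, not part of Jan's API):

```python
# Hypothetical workaround: remove exactly one leading space from the
# returned content so note titles / .md filenames come out clean, while
# preserving any intentional indentation beyond the first character.
def clean_content(content: str) -> str:
    return content[1:] if content.startswith(" ") else content

print(repr(clean_content(" My Note Title")))  # -> 'My Note Title'
print(repr(clean_content("Already clean")))   # -> 'Already clean'
```

Stripping only one space (rather than `lstrip()`) avoids mangling responses that legitimately begin with indented content, e.g. a code block.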