Server UI bug: corrupted generation #9836
Labels: medium severity, server/webui, server, stale
What happened?
The server somehow corrupts the prompt, so tokens at the end of every line are lost.
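For illustration only (a hypothetical prompt, not the one from my test): sending a two-line prompt like "Write a haiku about autumn leaves" / "in the quiet morning rain" yields a response as if the model had received "Write a haiku about autumn" / "in the quiet morning", i.e. the trailing tokens of each line are gone.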
Here is how I run the server:
./build/bin/llama-server -m ~/Downloads/qwen2.5-7b-instruct-q4_0-00001-of-00002.gguf
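No host or port flags are set, so the server should be listening on llama-server's defaults (127.0.0.1:8080); the endpoint test below assumes that address.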
Here is how I test the CLI to confirm the bug is in the server:
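A minimal sketch of that test (the prompt string is a placeholder, not the one I actually used):

```sh
# Same model, run through llama-cli directly: no server, no web UI involved.
./build/bin/llama-cli \
  -m ~/Downloads/qwen2.5-7b-instruct-q4_0-00001-of-00002.gguf \
  -p "Write one sentence about llamas." \
  -n 64
```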
Here is the output from the CLI:
Here is how I test the server endpoints to confirm this is a UI bug:
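A sketch of the endpoint test (the prompt and n_predict value are placeholders; /completion is llama-server's plain completion route, and 127.0.0.1:8080 is assumed from the defaults above):

```sh
# Query the /completion endpoint directly, bypassing the web UI entirely.
curl -s http://127.0.0.1:8080/completion \
  -H "Content-Type: application/json" \
  -d '{"prompt": "Write one sentence about llamas.", "n_predict": 64}'
```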
The responses from the endpoints are valid:
Here are screenshots (captions only; images not reproduced):
- Old web UI
- New web UI
- New web UI Chat
- SimpleChat
- llama-cli
Affected:
Unaffected:
Name and Version
version: 3891 (d5cb868)
built with cc (Debian 12.2.0-14) 12.2.0 for x86_64-linux-gnu
What operating system are you seeing the problem on?
No response
Relevant log output
No response