Bug: llama-server web UI resets the text selection during inference on every token update #9608
Labels: bug, good first issue, help wanted, low severity
What happened?
When using llama-server, the output in the web UI can't be selected or copied until text generation stops. This appears to happen because the script replaces all of the DOM nodes of the current generation on every new token, which discards any active selection. Ideally, the existing text content shouldn't be replaced during generation, so the text can be copied while output is still being produced.
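A likely fix is to append each incoming token to the existing DOM instead of rebuilding it. Below is a minimal sketch of the two approaches, assuming a plain-DOM rendering path; the element and function names are hypothetical illustrations, not the actual llama-server UI code:

```ts
// Hypothetical sketch, not the real llama-server UI code.

// Re-rendering the whole message on each token resets the selection,
// because the browser drops the old DOM nodes the selection anchored to.
function renderAll(output: HTMLElement, fullText: string): void {
  output.textContent = fullText; // replaces all child nodes
}

// Appending the new token to an existing Text node leaves earlier
// content untouched, so a selection over it survives the update.
function appendDelta(output: HTMLElement, delta: string): void {
  let node = output.lastChild;
  if (!(node instanceof Text)) {
    node = document.createTextNode("");
    output.appendChild(node);
  }
  node.appendData(delta); // CharacterData.appendData extends the node in place
}
```

A framework with keyed or diffed rendering would achieve the same effect by leaving unchanged nodes in place and only appending new content.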
Name and Version
version: 3755 (822b632)
built with cc (Ubuntu 11.4.0-1ubuntu1~22.04) 11.4.0 for x86_64-linux-gnu
What operating system are you seeing the problem on?
No response
Relevant log output
No response