
Bug: llama-server web UI resets the text selection during inference on every token update #9608

Open
mashdragon opened this issue Sep 23, 2024 · 2 comments
Labels
bug (Something isn't working), good first issue (Good for newcomers), help wanted (Extra attention is needed), low severity (Used to report low severity bugs in llama.cpp, e.g. cosmetic issues, non-critical UI glitches)

Comments

mashdragon commented Sep 23, 2024

What happened?

When using llama-server, the output in the web UI can't be selected or copied until text generation stops. This appears to be because the script replaces all of the DOM nodes for the current generation each time a new token arrives, which resets any active text selection.

Ideally, the existing text content wouldn't be replaced during generation, so the text can be copied while output is still being produced.
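For illustration, here is a minimal sketch of the two update strategies. The names (`messageEl`, `onTokenReplace`, `onTokenAppend`) are hypothetical and not taken from the actual llama-server UI code; the point is that rebuilding the subtree invalidates the selection, while growing an existing text node in place keeps selection anchors valid:

```js
// NOTE: a minimal illustrative sketch, not the actual llama-server UI code.
// `messageEl` is assumed to be the element holding the streamed completion.

// Pattern the issue describes: rebuilding the content on every token.
// Replacing the children invalidates any selection anchored inside them.
function onTokenReplace(messageEl, fullText) {
  messageEl.textContent = fullText; // removes the old text node, inserts a new one
}

// Selection-friendly alternative: grow the existing text node in place.
// Selection anchors are (node, offset) pairs, so extending the same node's
// data leaves an in-progress selection intact.
function onTokenAppend(messageEl, newToken) {
  let textNode = messageEl.firstChild;
  if (!textNode || textNode.nodeType !== Node.TEXT_NODE) {
    textNode = document.createTextNode('');
    messageEl.appendChild(textNode);
  }
  textNode.appendData(newToken); // appends without replacing the node
}
```

In practice the UI also renders markdown, so a real fix would likely need to patch only the tail of the rendered output rather than re-rendering the whole message; the sketch just shows why plain-text appends preserve the selection.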

Name and Version

version: 3755 (822b632)
built with cc (Ubuntu 11.4.0-1ubuntu1~22.04) 11.4.0 for x86_64-linux-gnu

What operating system are you seeing the problem on?

No response

Relevant log output

No response

mashdragon added the bug-unconfirmed and low severity labels on Sep 23, 2024
github-actions bot added the stale label on Oct 24, 2024

github-actions bot commented Nov 7, 2024

This issue was closed because it has been inactive for 14 days since being marked as stale.

github-actions bot closed this as completed on Nov 7, 2024
mashdragon (Author) commented

This is still an issue which impacts usability.

slaren reopened this on Nov 10, 2024
slaren added the bug, help wanted, and good first issue labels and removed the bug-unconfirmed and stale labels on Nov 10, 2024