Name and Version
version: 4807 (72fc3916)
built with Intel(R) oneAPI DPC++/C++ Compiler 2025.0.4 (2025.0.4.20241205) for x86_64-unknown-linux-gnu
Operating systems
Linux
Which llama.cpp modules do you know to be affected?
llama-server
Command line
Any command line will work, as long as the server has loaded a model and is running.
Problem description & steps to reproduce
- Run llama-server with any model
- Terminate/shut down the server (e.g. with Ctrl-C)
First Bad Commit
Not sure, but it is related to these two places:

llama.cpp/examples/server/server.cpp, lines 4453 to 4458 (commit 1a24c46)

and

llama.cpp/examples/server/server.cpp, lines 4502 to 4503 (commit 1a24c46)
It seems the server thread t is being destructed while still joinable at the time of termination. The issue can be fixed by uncommenting the t.join(); statements in both places. I opened an issue instead because of the comment next to them.
Relevant log output
srv update_slots: all slots are idle
srv log_server_r: request: POST /v1/chat/completions 127.0.0.1 200
^Csrv operator(): operator(): cleaning up before exit...
terminate called without an active exception
Aborted (core dumped)