Name and Version
$ ./build/bin/llama-cli --version
version: 4275 (6c5bc062)
built with cc (Debian 12.2.0-14) 12.2.0 for x86_64-linux-gnu
Operating systems
Linux
Which llama.cpp modules do you know to be affected?
llama-server
Problem description & steps to reproduce
Problem: I cannot get the loaded model information from the server's global properties endpoint (GET /props).
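For context, before the regression the relevant part of the /props response looked roughly like this (sibling fields omitted; the shape is inferred from the jq path used below, and the path shown is the one from the working example):

{
  "default_generation_settings": {
    "model": "/models/Llama-3.3-70B-Instruct-GGUF/Llama-3.3-70B-Instruct-Q4_K_M.gguf"
  }
}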
How to reproduce:
- Build from commit 6c5bc06 with cmake:
git checkout 6c5bc06
cmake -B build
cmake --build build --config Release -j
- Start llama-server with any model (an example invocation is sketched after these steps).
- Get the default_generation_settings.model value from http://HOST:PORT/props:
curl http://HOST:PORT/props -H "Content-Type: application/json" -s | jq -r '.default_generation_settings.model'
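As noted in the second step, a minimal server invocation looks like the following; the model path, host, and port are placeholders, and any GGUF model reproduces the issue:

./build/bin/llama-server -m /path/to/any-model.gguf --host 0.0.0.0 --port 8080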
Result:
null
Before commit 6c5bc06 this worked properly. If you repeat the steps above at commit 7736837, the command outputs the correct model path. Example:
$ curl http://HOST:PORT/props -H "Content-Type: application/json" -s | jq -r '.default_generation_settings.model'
/models/Llama-3.3-70B-Instruct-GGUF/Llama-3.3-70B-Instruct-Q4_K_M.gguf
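For convenience, the whole check can be scripted. A minimal sketch, assuming a server is already running and jq is installed; HOST and PORT are placeholders as above:

#!/usr/bin/env sh
# Query the server properties and extract the reported model path.
MODEL=$(curl -s http://HOST:PORT/props -H "Content-Type: application/json" \
  | jq -r '.default_generation_settings.model')

# Builds at commit 6c5bc06 or later print "null" here;
# builds at commit 7736837 print the GGUF path.
if [ "$MODEL" = "null" ]; then
  echo "BUG: default_generation_settings.model is null"
  exit 1
fi
echo "model: $MODEL"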
First Bad Commit
commit: 6c5bc06
PR: server : (refactoring) do not rely on JSON internally #10643
Relevant log output
No response