Misc. bug: llama-server throws "Unsupported param: tools" #10920

Open
hsm207 opened this issue Dec 20, 2024 · 1 comment
hsm207 commented Dec 20, 2024

Name and Version

version: 4369 (21ae3b9)
built with Apple clang version 15.0.0 (clang-1500.3.9.4) for arm64-apple-darwin23.6.0

Operating systems

Mac

Which llama.cpp modules do you know to be affected?

llama-server

Problem description & steps to reproduce

llama-server throws:

{"code":500,"message":"Unsupported param: tools","type":"server_error"}

if the request includes a "tools" parameter.

To reproduce:

  1. Start llama-server, e.g.:

./llama-server \
  -m andyywl_Qwen2.5-1.5B-Instruct.Q6_K.gguf_Qwen2.5-1.5B-Instruct.Q6_K.gguf \
  --port 8080

  2. Make a request to the server that includes the "tools" parameter, e.g. via curl (a Python equivalent is sketched after the command):

curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer NOT_APPLICABLE_FOR_LOCAL_MODELS" \
  -X POST \
  -d '{
    "messages": [
      {
        "content": "You are a helpful AI assistant. Solve tasks using your tools. Reply with TERMINATE when the task has been completed.",
        "role": "system"
      },
      {
        "content": "What is the weather in New York?",
        "role": "user",
        "name": "user"
      }
    ],
    "model": null,
    "stream": false,
    "tools": [
      {
        "type": "function",
        "function": {
          "name": "get_weather",
          "description": "",
          "parameters": {
            "type": "object",
            "properties": {
              "city": {
                "description": "city",
                "title": "City",
                "type": "string"
              }
            },
            "required": ["city"]
          }
        }
      }
    ]
  }'
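
For reference, here is the same request expressed with Python's requests package (a minimal sketch that mirrors the curl payload above; it assumes requests is installed and the server from step 1 is listening on port 8080). It produces the identical 500 response:

# Minimal sketch: same payload as the curl call above, sent via Python's
# "requests" package against the local llama-server OpenAI-compatible endpoint.
import requests

payload = {
    "messages": [
        {"role": "system", "content": "You are a helpful AI assistant. Solve tasks using your tools. Reply with TERMINATE when the task has been completed."},
        {"role": "user", "name": "user", "content": "What is the weather in New York?"},
    ],
    "model": None,
    "stream": False,
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_weather",
                "description": "",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "city": {"description": "city", "title": "City", "type": "string"}
                    },
                    "required": ["city"],
                },
            },
        }
    ],
}

r = requests.post(
    "http://localhost:8080/v1/chat/completions",
    headers={"Authorization": "Bearer NOT_APPLICABLE_FOR_LOCAL_MODELS"},
    json=payload,
)
print(r.status_code)  # 500
print(r.json())       # {"code": 500, "message": "Unsupported param: tools", "type": "server_error"}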

First Bad Commit

No response

Relevant log output

srv          init: initializing slots, n_slots = 1
slot         init: id  0 | task -1 | new slot n_ctx_slot = 4096
main: model loaded
main: chat template, built_in: 1, chat_example: '<|im_start|>system
You are a helpful assistant<|im_end|>
<|im_start|>user
Hello<|im_end|>
<|im_start|>assistant
Hi there<|im_end|>
<|im_start|>user
How are you?<|im_end|>
<|im_start|>assistant
'
main: server is listening on http://127.0.0.1:8080 - starting the main loop
srv  update_slots: all slots are idle
got exception: {"code":500,"message":"Unsupported param: tools","type":"server_error"}
request: POST /v1/chat/completions 127.0.0.1 500
okigan (Contributor) commented Dec 25, 2024

Running into the same issue.
