FEAT: ChatGLM3 tool calls #701
Conversation
LGTM
What is the error message? Did you test function calling on this branch?
Yes, I merged your branch and tested it.
It seems `tools` is an empty slice, not null.
@waltcow How did you call it? Using the Python client, or just via the RESTful API?
I couldn't reproduce it. Could you check your openai version? It needs to be later than v1.0.
`(infer) ➜ inference git:(tool) pip show openai`
@waltcow Thanks for your feedback. I've reproduced it: this issue happens with the PyTorch format, and we will fix it in this PR.
It works now, thanks.
This PR supports tool calls for ChatGLM3. Note that the function arguments in the response are different:
POST URL:
http://localhost:42183/v1/chat/completions
Payload JSON:
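A minimal sketch of such a request, assuming a hypothetical `get_current_weather` tool and the local endpoint above (the model name and tool definition are illustrative assumptions, not the exact payload from this PR):

```python
import requests

# Illustrative sketch only: the model name and the get_current_weather tool
# are assumptions, not the exact payload from this PR.
payload = {
    "model": "chatglm3",
    "messages": [
        {"role": "user", "content": "What is the weather like in Beijing today?"}
    ],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_current_weather",
                "description": "Get the current weather for a given city",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "city": {"type": "string", "description": "The city name"}
                    },
                    "required": ["city"],
                },
            },
        }
    ],
}

# Send the OpenAI-compatible chat completion request with tools attached.
response = requests.post(
    "http://localhost:42183/v1/chat/completions", json=payload, timeout=60
)
print(response.json())
```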
ChatGLM3 response (the `arguments` field is a JSON string):
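Continuing the request sketch above, the tool call could be handled like this (field values are assumptions; the key point is that `function.arguments` arrives as a JSON string and needs to be decoded):

```python
import json

# Sketch of handling the tool call from the response above; the schema follows
# the OpenAI-compatible format, and the concrete values are assumptions.
choice = response.json()["choices"][0]
tool_call = choice["message"]["tool_calls"][0]

# ChatGLM3 returns the function arguments as a JSON string, so decode it
# before dispatching to the real function.
arguments = json.loads(tool_call["function"]["arguments"])
print(tool_call["function"]["name"], arguments)
```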
ChatGLM3 tool usage: https://github.com/THUDM/ChatGLM3/blob/main/tool_using/README.md
chatglm.cpp demo: https://github.com/li-plus/chatglm.cpp/blob/main/examples/chatglm3_demo.py
Closes: #676