Ollama Chat - parse tool calls on streaming #11171


Merged: 16 commits into main on May 27, 2025

Conversation

@krrishdholakia (Contributor) commented on May 27, 2025

  • fix(user_api_key_auth.py): fix else block

Fixes #11170

  • refactor(ollama/chat): refactor to base config pattern

makes fixes easier to maintain

  • fix(ollama/chat): support tool call parsing on streaming

Closes #11104
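The streaming fix above has to translate Ollama's streamed tool calls (which arrive as complete objects within a single chunk, with arguments as a dict and no tool-call id) into OpenAI-style deltas. A minimal sketch of that translation, with a hypothetical helper name (`ollama_chunk_to_delta`) and an assumed Ollama chunk shape; this is an illustration, not the PR's actual implementation:

```python
import json
import uuid


def ollama_chunk_to_delta(chunk: dict) -> dict:
    """Translate one streamed Ollama chat chunk into an OpenAI-style delta.

    Hypothetical helper: Ollama emits whole tool calls in a chunk and omits
    tool-call ids, so we serialize arguments to JSON and synthesize an id.
    """
    message = chunk.get("message", {})
    tool_calls = []
    for index, call in enumerate(message.get("tool_calls", [])):
        fn = call.get("function", {})
        tool_calls.append({
            "id": f"call_{uuid.uuid4().hex[:8]}",  # Ollama sends no id
            "index": index,
            "type": "function",
            "function": {
                "name": fn.get("name"),
                # Ollama sends arguments as a dict; OpenAI deltas use a JSON string
                "arguments": json.dumps(fn.get("arguments", {})),
            },
        })
    return {
        "role": message.get("role", "assistant"),
        "content": message.get("content") or None,
        "tool_calls": tool_calls or None,
    }


# Example chunk in Ollama's streaming format (assumed shape)
chunk = {"message": {"role": "assistant", "content": "",
                     "tool_calls": [{"function": {"name": "get_weather",
                                                  "arguments": {"city": "Paris"}}}]}}
delta = ollama_chunk_to_delta(chunk)
```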

vercel bot commented on May 27, 2025

The latest updates on your projects:

litellm: ✅ Ready (updated May 27, 2025 11:15pm UTC)

@lsorber commented on May 27, 2025

@krrishdholakia Thanks for working on this! May I ask you to also verify that litellm.stream_chunk_builder successfully recovers the tool calls from the chunks? While debugging my unit test for this issue, I found that stream_chunk_builder does not include the tool calls, because Ollama does not return tool call ids as part of its response.
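The failure mode described here can be reproduced in miniature: an aggregator that groups streamed tool-call deltas by `id` will silently drop entries whose id is missing. A simplified sketch (not LiteLLM's actual stream_chunk_builder) of id-keyed aggregation:

```python
def build_tool_calls(deltas: list[dict]) -> list[dict]:
    """Aggregate streamed tool-call deltas, keyed by tool-call id.

    Simplified illustration: deltas without an id are skipped, which is
    why missing Ollama tool-call ids made the rebuilt message lose calls.
    """
    calls: dict[str, dict] = {}
    for delta in deltas:
        for tc in delta.get("tool_calls") or []:
            call_id = tc.get("id")
            if not call_id:  # Ollama omitted the id -> call is dropped
                continue
            entry = calls.setdefault(call_id, {
                "id": call_id,
                "type": "function",
                "function": {"name": "", "arguments": ""},
            })
            fn = tc.get("function", {})
            if fn.get("name"):
                entry["function"]["name"] = fn["name"]
            # argument fragments are concatenated across chunks
            entry["function"]["arguments"] += fn.get("arguments", "")
    return list(calls.values())


# A delta missing the id (as Ollama returned) vs. one carrying a synthetic id
without_id = [{"tool_calls": [{"function": {"name": "f", "arguments": "{}"}}]}]
with_id = [{"tool_calls": [{"id": "call_1", "function": {"name": "f", "arguments": "{}"}}]}]
```

With this scheme, assigning ids during translation (as the PR does inside the Ollama handler) is what lets the rebuilt message keep its tool calls.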

@krrishdholakia (Contributor, Author) replied:

Hey @lsorber, this is solved by handling the tool call translation in ollama/chat.py. This follows our newer pattern of keeping all the LLM translation work inside the llm folder.

This then returns a standard ModelResponseStream object, which the stream chunk builder can process.

@krrishdholakia krrishdholakia merged commit 4c82dd9 into main May 27, 2025
8 of 45 checks passed
Labels: none yet
Projects: none yet
Development

Successfully merging this pull request may close these issues.

  • [Bug]: Warning - Key=sh-xxx is not a string.
  • [Bug]: Ollama models are unable to do tool calling via LiteLLM
2 participants