[Bug]: Ollama models are unable to do tool calling via LiteLLM #11104

Closed
@ericmjl

Description

What happened?

A bug happened!

@krrishdholakia and @ishaan-jaff I think I may have found a bug with Ollama models and tool calling, which I am not quite sure how to fix.

To begin, I have a minimal reproducible example here: https://gist.github.com/ericmjl/289be4c6b46adb525175c4b1db2f97f1

I tried three Ollama models that are known to be capable of tool calling.

When I use the Completions API to call out to ollama_chat/ models, I am unable to get a tool call response from the models. However, when I switch to using the Ollama API directly, I almost always get a tool call response.

I'm not sure what's happening here; could something be going wrong in LiteLLM's translation layer to Ollama?
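For reference, a minimal sketch of the two call paths being compared. The model name (`llama3.1`) and the `get_weather` tool schema are illustrative assumptions, not taken from the gist; running either path requires a local Ollama server plus `litellm` and `ollama` installed.

```python
# A simple tool schema in the OpenAI function-calling format.
# The tool name and parameters here are hypothetical examples.
TOOLS = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }
]

MESSAGES = [{"role": "user", "content": "What is the weather in Boston?"}]


def call_via_litellm():
    """Path 1: route through LiteLLM's ollama_chat/ translation layer."""
    import litellm

    resp = litellm.completion(
        model="ollama_chat/llama3.1", messages=MESSAGES, tools=TOOLS
    )
    # Expected (but reportedly absent): a populated tool_calls list.
    return resp.choices[0].message.tool_calls


def call_via_ollama():
    """Path 2: call the Ollama API directly with the same tool schema."""
    import ollama

    resp = ollama.chat(model="llama3.1", messages=MESSAGES, tools=TOOLS)
    # Here tool calls reportedly do come back on the response message.
    return resp["message"].get("tool_calls")
```

The reported behavior is that the first path returns no tool calls while the second nearly always does, with the same model and the same tool schema.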

Relevant log output

To view the notebook, run:

uvx marimo edit https://gist.githubusercontent.com/ericmjl/289be4c6b46adb525175c4b1db2f97f1/raw/c10288114b3b3d965c5d5f8d28770d5da0b83b52/reprex_ollama_tools_failing.py

Are you a ML Ops Team?

No

What LiteLLM version are you on ?

v1.70.4

Twitter / LinkedIn details

No response

Metadata

Labels

bug (Something isn't working)
