The Ollama 0.3 update brought function calling (tool use) to local models such as Llama 3.1, Mistral v0.3, and others. When do you plan to add support for it?
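
For reference, here is a minimal sketch of what Ollama 0.3-style tool calling looks like against a locally running server. It assumes the default endpoint (`http://localhost:11434`), a pulled `llama3.1` model, and a hypothetical `get_current_weather` tool used purely for illustration:

```python
import json
import requests

# Hypothetical tool definition, declared in the OpenAI-style schema Ollama accepts.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string", "description": "City name"},
                },
                "required": ["city"],
            },
        },
    }
]

# Send a chat request with the tool list attached (assumes Ollama >= 0.3
# is running locally and llama3.1 has been pulled).
resp = requests.post(
    "http://localhost:11434/api/chat",
    json={
        "model": "llama3.1",
        "messages": [{"role": "user", "content": "What is the weather in Paris?"}],
        "tools": tools,
        "stream": False,
    },
    timeout=120,
)
resp.raise_for_status()

# When the model decides to call a tool, the assistant message carries
# `tool_calls` with the function name and structured arguments.
message = resp.json()["message"]
for call in message.get("tool_calls", []):
    fn = call["function"]
    print(fn["name"], json.dumps(fn["arguments"]))
```

Supporting this would roughly mean passing a tool schema through to the chat request and handling any `tool_calls` entries in the reply, alongside the usual text content.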