Add support for using WireAPI:Responses with Ollama #8408

@scaryrawr

Description

What feature would you like to see?

There's a warning when using Ollama about moving to the Responses API, but for Ollama the wire API is hard-coded to Chat. Since not everyone may have an up-to-date Ollama install, it would be nice to have an experimental flag that enables Responses.
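
For reference, custom model providers in `~/.codex/config.toml` already expose a `wire_api` setting, so one possible shape for the opt-in is a provider entry like the sketch below. This is only a sketch assuming the flag reuses the existing field; the provider key and display name are illustrative:

```toml
# Hypothetical opt-in: a custom provider entry pointing at a local Ollama
# server, requesting the Responses wire API instead of the default Chat.
[model_providers.ollama-responses]
name = "Ollama (Responses)"
base_url = "http://localhost:11434/v1"
wire_api = "responses"  # the built-in Ollama path is currently pinned to Chat
```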

Additional information

Ollama recently added Responses support: https://docs.ollama.com/api/openai-compatibility#%2Fv1%2Fresponses
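
For anyone checking whether their local Ollama build is new enough, here's a minimal sketch using the `openai` Node SDK against Ollama's OpenAI-compatible endpoint. The model name is illustrative (use whatever you have pulled); an older Ollama without `/v1/responses` will return a 404 here while `/v1/chat/completions` still works:

```ts
import OpenAI from "openai";

// Point the OpenAI client at the local Ollama server (default port 11434).
const client = new OpenAI({
  baseURL: "http://localhost:11434/v1",
  apiKey: "ollama", // Ollama ignores the key, but the SDK requires one
});

// Succeeds only on Ollama versions that serve /v1/responses.
const response = await client.responses.create({
  model: "llama3.2", // illustrative; substitute a locally pulled model
  input: "Reply with one word: pong",
});

console.log(response.output_text);
```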

Labels

custom-model (Issues related to custom model providers, including local models), enhancement (New feature or request)
