Locally hosted language model (like llama.cpp) #38059

@anzestrela

Description

I don't like that you have to pay OpenAI and send them your data to use the new AI smart picker tools.

I think it'd be great if we could use locally hosted models, for example via llama.cpp.
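Since llama.cpp ships an OpenAI-compatible HTTP server (`llama-server`), an integration could in principle keep speaking the same chat-completions protocol and just point at a local base URL. A minimal sketch of the request such a server accepts — the host, port, and model name here are assumptions for illustration, not a tested setup:

```python
import json

# Assumes llama.cpp's built-in server is running locally, e.g.:
#   llama-server -m model.gguf --host 127.0.0.1 --port 8080
# (the port and model name are placeholders, not a confirmed configuration)
BASE_URL = "http://127.0.0.1:8080/v1"  # local endpoint instead of api.openai.com

def build_chat_request(prompt: str) -> dict:
    """Build an OpenAI-style chat-completions payload for the local server."""
    return {
        # llama-server typically serves whatever model it was started with,
        # regardless of this field's value
        "model": "local-model",
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.2,
    }

payload = build_chat_request("Suggest a folder name for these files.")
# This JSON body would be POSTed to BASE_URL + "/chat/completions"
print(json.dumps(payload))
```

Because the wire format matches OpenAI's chat-completions API, existing client code would mostly need a configurable base URL rather than a new backend.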

Metadata

Assignees

No one assigned

    Labels

    0. Needs triage (pending check for reproducibility or if it fits our roadmap), enhancement

    Type

    No type

    Projects

    No projects

    Milestone

    No milestone

    Relationships

    None yet

    Development

    No branches or pull requests
