[FR] Add support for structured extraction with Ollama models #68
It would be good to have `aiextract` enabled for Ollama models.

Comments
There is no official support yet, but you can easily build it yourself with the following guide: https://svilupp.github.io/PromptingTools.jl/dev/how_it_works#Walkthrough-Example-for-aiextract. It works well with mixtral and similar models.
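For reference, a minimal sketch of the DIY approach from that walkthrough: prompt the model for JSON and parse it into a struct yourself. `aigenerate` and `OllamaSchema` exist in PromptingTools today; the struct, prompt wording, and example text below are illustrative, and it assumes a local Ollama server with `mixtral` pulled:

```julia
using PromptingTools
const PT = PromptingTools
using JSON3

# Flat target struct -- nested types are harder for OSS models.
struct Measurement
    location::String
    temperature::Float64
end

# Ask the model to reply with JSON only; parsing may still fail if the
# model wraps the JSON in extra text, so retry logic is advisable.
msg = aigenerate(PT.OllamaSchema(),
    """Extract the location and temperature from the text below.
    Reply ONLY with a JSON object with keys "location" (string) and "temperature" (number).

    Text: It was 21.5 degrees in Lisbon yesterday.""";
    model = "mixtral")

obj = JSON3.read(msg.content)
result = Measurement(obj.location, obj.temperature)
```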
I'm willing to handle this one; it should be relatively straightforward to do. As a clarifying question, what's the difference between a regular schema and a managed one?
Great! There are different API endpoints in Ollama:

- `/api/generate`, which `OllamaManagedSchema` wraps (Ollama manages the prompt template for you)
- `/api/chat`, which `OllamaSchema` wraps (you send standard chat messages)

From that perspective, I'd assume you would add `aiextract` for `OllamaSchema`, which is built around `/api/chat`. I'd suggest avoiding nested `return_type`s (they are harder for OSS models). A good model to use for `aiextract` is mixtral, if you can run it locally. Does that answer your question?
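To make the target concrete, here is a sketch of how the call might look once this lands. It is hypothetical, since `aiextract` is not yet implemented for `OllamaSchema`; it simply mirrors the existing `aiextract` signature used with other schemas, with a flat return type as suggested above:

```julia
using PromptingTools
const PT = PromptingTools

# Flat return type, per the advice above (no nested structs).
struct Person
    name::String
    age::Int
end

# Hypothetical call: not yet implemented for OllamaSchema; mirrors
# the existing aiextract signature for the OpenAI schemas.
msg = aiextract(PT.OllamaSchema(),
    "Jane Doe is 42 years old.";
    return_type = Person,
    model = "mixtral")

msg.content  # expected: Person("Jane Doe", 42)
```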