ollama support? #1001
@txhno Sorry about that random weird comment... I removed your reply too, since it had a quote of the link in it; hope that's OK! On topic: exploring Ollama support is a really good idea. My understanding is that they just use … We can put this on our backlog to investigate, but if you (or anyone reading this!) have some knowledge of how Ollama works, I'd be happy to tag-team and support a PR here. @riedgar-ms @nking-1 for awareness
Hi! I’ve implemented a thin wrapper for Ollama support in my fork. Can you give it a shot before I submit a PR? Thanks!
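For anyone curious what such a thin wrapper could look like: a minimal sketch along these lines, assuming a locally running Ollama server at its default address (`localhost:11434`) and its `/api/generate` REST endpoint. The function names here (`build_payload`, `generate`) are hypothetical, not from the fork.

```python
import json
from urllib import request

# Ollama's default local REST endpoint (assumes a local server is running)
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint.

    stream=False asks Ollama to return one complete JSON object
    instead of a stream of partial responses.
    """
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    """Send a prompt to a locally running Ollama server and return its text."""
    body = json.dumps(build_payload(model, prompt)).encode("utf-8")
    req = request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

This only covers one-shot generation; a real integration would also need to handle model listing, streaming, and error cases.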
Is your feature request related to a problem? Please describe.
I would want to reuse the models that I already have downloaded in Ollama.
Describe the solution you'd like
Being able to use `models.ollama(model_name_or_path)`.
Describe alternatives you've considered
llama.cpp works as of now, but Ollama would make this app a lot more user friendly by automating downloads and storing models centrally.
Additional context
None.