✨ feat(llm): support Ollama AI Provider (local llm) #1265
Conversation
👍 @sjy Thank you for raising your pull request and contributing to our Community
Force-pushed from 9f39bc6 to 11956ab
@sjy A great PR! 👏👏 I think there are two issues we should discuss:
I have tried it on my MacBook Pro; the speed is really amazing. cc @cocobeach Ollama.mp4
That is truly impressive, can't wait to test it. I only have an i9 (8 cores) and 32 GB of RAM, so I doubt it will run that fast. I can run Solar reasonably well, and I find the open-source models less "lazy" than GPT-4, which can't seem to be bothered to write code without placeholders. What are the specs of your MacBook? It is indeed mighty fast.
I think this PR doesn't need to consider #1257; I will do that later. This PR just needs to focus on the Ollama provider. Besides the visibility control, I think there should be another field for custom models in Ollama, the same as OpenAI's custom models, so users can add their own local models. As for the Ollama SDK, I prefer your current implementation: after investigating the ollama js SDK, it currently doesn't support the browser, so we can't use it when working on #1257. Therefore, I think it's better to just use the OpenAI SDK.
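Since the ollama js SDK doesn't run in the browser, the approach discussed above is to reuse the OpenAI SDK against Ollama's OpenAI-compatible endpoint (served under `/v1` on Ollama's default port 11434). A minimal sketch of what such a request could look like; the `llama2` model name and the helper function are illustrative assumptions, not code from this PR:

```typescript
// Sketch: target a local Ollama server with an OpenAI-style chat request.
// Ollama exposes an OpenAI-compatible API under /v1 on its default port,
// so the OpenAI SDK (or a plain fetch) can stand in for ollama-js.
const OLLAMA_BASE_URL = "http://localhost:11434/v1"; // Ollama's default port

interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Build the same request shape the OpenAI SDK would send.
function buildChatRequest(model: string, messages: ChatMessage[]) {
  return {
    url: `${OLLAMA_BASE_URL}/chat/completions`,
    body: { model, messages, stream: true },
  };
}

const req = buildChatRequest("llama2", [
  { role: "user", content: "Hello from a local model!" },
]);
console.log(req.url); // http://localhost:11434/v1/chat/completions
```

Because the wire format matches OpenAI's, the provider only needs a different base URL and model list, which is why reusing the OpenAI SDK is the lighter option here.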
My MacBook Pro is an M1 Max 16", 64 GB RAM.
The speed is impressive; x86 is a bit slower.
Looking forward to this new feature, to be able to use local Ollama models.
Force-pushed from ae16363 to 2496fa5
@arvinxx, updated. BTW, the Qwen model is added to the default list when Ollama is enabled. Please help review the changes ~
@sjy It's great 👍. Let me take over the rest of the task~
❤️ Great PR @sjy ❤️ The growth of the project is inseparable from user feedback and contributions; thanks for your contribution! If you are interested in the lobehub developer community, please join our Discord and then DM @arvinxx or @canisminor1990. They will invite you to our private developer channel, where we discuss lobe-chat development and share AI newsletters from around the world.
💻 Change Type
🔀 Description of Change
📝 Additional Information
Refs #1283