update ollama 0.3.x support #11709
Comments
Hi @przybjul, we can upgrade the version of ipex-llm's ollama, and we will notify you as soon as the upgrade is complete.
I'd like to know whether the upgrade has been completed yet.
Hi @AlbertXu233, our current version is still consistent with v0.1.39 of ollama. Do you need the newly supported models or other recent features from ollama 0.3.x?
@sgwhat Some new models are only supported by higher versions of Ollama.
Hi @rebootcheng, we now support models available in higher versions of Ollama, including Llama 3.1, Qwen 2, Gemma 2, Phi 3, and others.
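For reference, here is a minimal sketch of how one of these models could be queried through a locally running ipex-llm ollama server, assuming the default address localhost:11434 and that the model used here (qwen2, as an example) has already been pulled:

```python
import requests

# Minimal sketch: query one of the newly supported models through the
# local ollama server started by ipex-llm (default address assumed).
OLLAMA_URL = "http://localhost:11434"

resp = requests.post(
    f"{OLLAMA_URL}/api/generate",
    json={
        "model": "qwen2",        # any locally pulled model, e.g. llama3.1, gemma2, phi3
        "prompt": "Briefly explain what IPEX-LLM is.",
        "stream": False,         # return one JSON object instead of a stream
    },
    timeout=300,
)
resp.raise_for_status()
print(resp.json()["response"])
```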
Hi @sgwhat, the deepseek-coder-v2 model still doesn't seem to be supported.
hi, "tools" seems to be only available in ollama 0.3.x - I would be waiting for that. :) |
I need gemma2 2b and the batch embeddings API from the newer ollama version.
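For context, the batch embeddings feature in newer ollama versions is exposed through the /api/embed endpoint, which accepts a list of inputs in a single request (the older /api/embeddings took one prompt at a time). A rough sketch, where nomic-embed-text stands in for any locally pulled embedding model:

```python
import requests

OLLAMA_URL = "http://localhost:11434"

# Batch embeddings via the newer /api/embed endpoint.
resp = requests.post(
    f"{OLLAMA_URL}/api/embed",
    json={
        "model": "nomic-embed-text",   # example embedding model
        "input": [
            "IPEX-LLM accelerates LLMs on Intel GPUs.",
            "Ollama 0.3.x adds tool calling and batch embeddings.",
        ],
    },
    timeout=300,
)
resp.raise_for_status()
embeddings = resp.json()["embeddings"]
print(len(embeddings), "vectors of dimension", len(embeddings[0]))
```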
Hi all, ipex-llm's ollama has been upgraded to 0.3.6.
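One quick way to confirm which base ollama version a local ipex-llm build reports is the standard /api/version endpoint; a small sketch, assuming the server is running on the default port:

```python
import requests

# Check which ollama version the locally running ipex-llm build reports.
resp = requests.get("http://localhost:11434/api/version", timeout=10)
resp.raise_for_status()
print(resp.json()["version"])
```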
"ipex-llm[cpp]==2.5.0b20240527 is consistent with [v0.1.34] of ollama. Our current version is consistent with [v0.1.39] of ollama."
Is it possible to update the supported ollama version to 0.3.x?