Portable Builds Upstream Version Update Request #13184

Open
@bmh129

Description

The portable build of Ollama with IPEX-LLM built in, including the latest weekly build, is based on an upstream Ollama release that is too old to support the newest LLMs from Google, Meta, and others (for example Gemma 3, shown below). Please consider rebasing the portable build on a newer upstream Ollama version.

C:\Users\a_user\Portable\ollama-ipex-llm-2.3.0b20250429-win>ollama pull gemma3:1b
pulling manifest
pulling 7cd4618c1faf... 100% ▕████████████████████████████████████████████████████████▏ 815 MB
pulling e0a42594d802... 100% ▕████████████████████████████████████████████████████████▏ 358 B
pulling dd084c7d92a3... 100% ▕████████████████████████████████████████████████████████▏ 8.4 KB
pulling 3116c5225075... 100% ▕████████████████████████████████████████████████████████▏ 77 B
pulling 120007c81bf8... 100% ▕████████████████████████████████████████████████████████▏ 492 B
verifying sha256 digest
writing manifest
success

C:\Users\a_user\Portable\ollama-ipex-llm-2.3.0b20250429-win>ollama run gemma3:1b
Error: llama runner process has terminated: this model is not supported by your version of Ollama. You may need to upgrade
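For reference, the upstream Ollama version a portable build is based on can be checked with "ollama -v" from the same directory. The transcript below is a hypothetical sketch; the actual version string depends on the upstream release the portable build was made from:

C:\Users\a_user\Portable\ollama-ipex-llm-2.3.0b20250429-win>ollama -v
ollama version is 0.5.4

Gemma 3 support landed upstream in Ollama 0.6.0, so a portable build bundling an older upstream version fails with the "model is not supported by your version of Ollama" error above even though the pull succeeds.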
