
Model fails to load with RPC error after clean install #5314

Open
@tescophil

Description

LocalAI version:
LocalAI version: v2.28.0 (56f44d4)

Environment, CPU architecture, OS, and Version:
Linux desktop-garage 4.19.0-12-amd64 #1 SMP Debian 4.19.152-1 (2020-10-18) x86_64 GNU/Linux
Intel i3, 8 GB RAM

Describe the bug
A clean installation fails to run any model, producing the following error:

[llama-cpp] Fails: failed to load model with internal loader: could not load model: rpc error: code = Unavailable desc = error reading from server: EOF

To Reproduce
Install, run, download gemma-3-1b-it (tried a number of other models with the same result), open the chat and ask a question.
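For anyone trying to reproduce this outside the web UI, the same request can be made against LocalAI's OpenAI-compatible API. This is a sketch only: it assumes a default install listening on port 8080 and that the model is registered under the name `gemma-3-1b-it` (check your instance's model list; names may differ).

```shell
# Hypothetical reproduction of the chat request, assuming LocalAI's
# default address (localhost:8080) and the model name from the gallery.
# On the affected setup this should return the rpc/EOF error instead of a reply.
curl -s --max-time 30 http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "gemma-3-1b-it",
        "messages": [{"role": "user", "content": "Hello"}]
      }' \
  || echo "request failed (is LocalAI running on :8080?)"
```

If the backend process is crashing, the server-side debug log (as attached below) is where the underlying llama-cpp failure will appear; the HTTP response only surfaces the generic `rpc error ... EOF`.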

Expected behavior
I expect the model to load and answer the question.

Logs
Debug log attached below.

Additional context
If you don't have this specific problem then please don't hijack my issue 🙏

local-ai.log
