Model fails to load with RPC error after clean install #5314


Open
tescophil opened this issue May 4, 2025 · 0 comments
Labels
bug (Something isn't working), unconfirmed

Comments

@tescophil

LocalAI version:
v2.28.0 (56f44d4)

Environment, CPU architecture, OS, and Version:
Linux desktop-garage 4.19.0-12-amd64 #1 SMP Debian 4.19.152-1 (2020-10-18) x86_64 GNU/Linux
Intel i3, 8 GB RAM

Describe the bug
A clean installation fails to run any model, producing the following error:

[llama-cpp] Fails: failed to load model with internal loader: could not load model: rpc error: code = Unavailable desc = error reading from server: EOF

To Reproduce
Install LocalAI, run it, download gemma-3-1b-it (I tried a number of other models with the same result), open the chat UI, and ask a question. A sketch of these steps follows.
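
For reference, a minimal sketch of the steps above, assuming the standard quickstart install script and the default port (8080); the exact invocations may differ from what I ran:

# Install LocalAI with the official install script
curl https://localai.io/install.sh | sh

# Start the server and pull gemma-3-1b-it from the model gallery
local-ai run gemma-3-1b-it

# In a second terminal, send a chat request via the OpenAI-compatible API
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "gemma-3-1b-it", "messages": [{"role": "user", "content": "Hello"}]}'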

Expected behavior
I expect the model to load and answer the question.

Logs
Debug log attached

Additional context
If you don't have this specific problem then please don't hijack my issue 🙏

local-ai.log

@tescophil added the bug and unconfirmed labels on May 4, 2025