
worker llama-cpp-rpc crash #2609

Closed
JamesClarke7283 opened this issue Jun 20, 2024 · 3 comments · Fixed by #2620
Labels: bug (Something isn't working), confirmed

Comments


JamesClarke7283 commented Jun 20, 2024

LocalAI version:
v2.17.1

Environment, CPU architecture, OS, and Version:
Linux desktop 6.9.5-arch1-1 #1 SMP PREEMPT_DYNAMIC Sun, 16 Jun 2024 19:06:37 +0000 x86_64 GNU/Linux
Arch Linux

Describe the bug
The RPC worker crashes on startup.

To Reproduce

sudo wget https://github.com/mudler/LocalAI/releases/download/v2.17.1/local-ai-Linux-x86_64 -O /usr/bin/localai
sudo chmod +x /usr/bin/localai
/usr/bin/localai worker llama-cpp-rpc --debug -- 
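
(A hedged aside, not part of the original report: the arguments after "--" are assumed to be forwarded to the underlying llama.cpp rpc-server, so a worker would normally be started with an explicit bind address. The host and port below are hypothetical.)

# assumption: flags after "--" reach llama.cpp's rpc-server; the address and port are made up
/usr/bin/localai worker llama-cpp-rpc --debug -- --host 0.0.0.0 --port 50052

As the logs below show, the crash occurs even with nothing after "--".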

Expected behavior
The worker runs without crashing.

Logs

➜  ~ /usr/bin/localai worker llama-cpp-rpc --debug -- 

9:45AM DBG Setting logging to debug
9:45AM DBG Extracting backend assets files to /tmp/localai/backend_data
*** stack smashing detected ***: terminated
[1]    8658 IOT instruction (core dumped)  localai worker llama-cpp-rpc --debug --
➜  ~ 
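
(A hedged aside: on Arch with systemd-coredump handling crashes, the dumped core can be inspected like this to get a backtrace for the "stack smashing detected" abort. This assumes gdb is installed; "localai" matches the executable name used above.)

# list recorded crashes for the binary, then open the newest dump in gdb
coredumpctl list localai
coredumpctl gdb localai
# at the (gdb) prompt, "bt" prints the stack at the point of the abort
bt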

Additional context

JamesClarke7283 added the bug (Something isn't working) and unconfirmed labels on Jun 20, 2024
mudler (Owner) commented Jun 21, 2024

@JamesClarke7283 it's only the worker, right?

Can you confirm that inference works when you start the normal HTTP server?

JamesClarke7283 (Author) commented:

> @JamesClarke7283 it's only the worker, right?
>
> Can you confirm that inference works when you start the normal HTTP server?

Yes, running LocalAI normally works fine.
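
(For reference, a minimal sketch of that check, assuming running the binary with no subcommand starts the API server on LocalAI's default port 8080; the model name is hypothetical and presumes a model is already installed.)

# start the HTTP server, then hit the OpenAI-compatible chat endpoint
/usr/bin/localai &
curl http://localhost:8080/v1/chat/completions -H "Content-Type: application/json" \
  -d '{"model": "example-model", "messages": [{"role": "user", "content": "hello"}]}'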

JamesClarke7283 (Author) commented:

Thanks, when will a release that includes this fix be published?
I prefer the binary builds rather than building from source myself.
