rpc error: code = ResourceExhausted desc = grpc: received message larger than max (400000002 vs. 4194304) #1150
Comments
I'm experiencing the same issue: trying to use embedding with a llama model and llama backend. I'd be keen to hear about any possible solutions.
I experienced the same issue.
yourtiger seems to have found a valid solution. Can someone propose a permanent Pull/Merge Request to fix the issue?
Ran into this issue today when sending a lot of images. Relevant log excerpt:

[img-5][img-4][img-3][img-2][img-1][img-0]What are in these images? Is there any difference between them?<|eot_id|>
11:32AM DBG Prompt (before templating): <|start_header_id|>user<|end_header_id|> [img-5][img-4][img-3][img-2][img-1][img-0]What are in these images? Is there any difference between them?<|eot_id|>
11:32AM DBG Template found, input modified to: <|start_header_id|>user<|end_header_id|> [img-5][img-4][img-3][img-2][img-1][img-0]What are in these images? Is there any difference between them?<|eot_id|> <|start_header_id|>assistant<|end_header_id|>
11:32AM DBG Prompt (after templating): <|start_header_id|>user<|end_header_id|> [img-5][img-4][img-3][img-2][img-1][img-0]What are in these images? Is there any difference between them?<|eot_id|> <|start_header_id|>assistant<|end_header_id|>
11:32AM DBG Model already loaded in memory: llava-llama-3-8b-v1_1-int4.gguf
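Regarding the permanent fix requested above: the 4194304 in the error message is grpc-go's default 4 MiB receive limit, so a patch would presumably raise the send/receive limits where LocalAI dials its backend process (with matching options on the server side). The Go snippet below is only a minimal sketch of that idea; the size, address, and function names are illustrative, not the actual LocalAI code.

```go
package main

import (
	"log"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
)

// 4194304 (4 MiB) is grpc-go's default receive limit; 50 MiB here is an
// illustrative value, not the limit LocalAI actually ships with.
const maxMsgSize = 50 * 1024 * 1024

// dialBackend opens a client connection whose send/receive limits are raised
// above the 4 MiB default. The backend's grpc.NewServer would need matching
// grpc.MaxRecvMsgSize / grpc.MaxSendMsgSize server options.
func dialBackend(address string) (*grpc.ClientConn, error) {
	return grpc.Dial(
		address,
		grpc.WithTransportCredentials(insecure.NewCredentials()),
		grpc.WithDefaultCallOptions(
			grpc.MaxCallRecvMsgSize(maxMsgSize),
			grpc.MaxCallSendMsgSize(maxMsgSize),
		),
	)
}

func main() {
	conn, err := dialBackend("127.0.0.1:50051") // hypothetical backend address
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()
}
```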
LocalAI version:
quay.io/go-skynet/local-ai:latest
Environment, CPU architecture, OS, and Version:
Linux localhost.localdomain 3.10.0-1160.99.1.el7.x86_64 #1 SMP Wed Sep 13 14:19:20 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux
Describe the bug
rpc error: code = ResourceExhausted desc = grpc: received message larger than max (400000002 vs. 4194304)
I am using https://github.com/ymcui/Chinese-LLaMA-Alpaca: I downloaded the Chinese Alpaca-13B model, converted it to the quantized model file ggml-model-q4_0.gguf, and added a llama.yaml file in the models directory with the following content:
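A LocalAI model definition for a converted GGUF model of this kind typically looks something like the sketch below; the values here are illustrative, not the poster's actual llama.yaml:

```yaml
# models/llama.yaml -- illustrative sketch, not the original file
name: llama
backend: llama
context_size: 2048
parameters:
  model: ggml-model-q4_0.gguf
  temperature: 0.7
```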
I started it in DEBUG mode.
Using Postman, I sent the POST request as shown in the following figure.
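For reference, the equivalent request outside Postman would be a POST against LocalAI's OpenAI-compatible completions endpoint, along these lines; the endpoint, port, and body here are assumptions based on the description, not the original request:

```bash
# Hypothetical reconstruction of the Postman request: adjust host, port,
# model name, and prompt to match the actual setup.
curl http://localhost:8080/v1/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "llama",
        "prompt": "Hello, please introduce yourself.",
        "temperature": 0.7
      }'
```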
The log output of LocalAI is as follows:
What should I do, please?