
There is something wrong with VLM #2668

Open
techResearcher2021 opened this issue Jun 27, 2024 · 3 comments
Labels: bug (Something isn't working), unconfirmed

Comments

@techResearcher2021

LocalAI version:

using docker image: latest-aio-gpu-nvidia-cuda-12

Environment, CPU architecture, OS, and Version:

Linux 0fe2bf31da79 5.15.133.1-microsoft-standard-WSL2 #1 SMP Thu Oct 5 21:02:42 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux

Describe the bug

The VLM llava-v1.6-mistral-7b.Q5_K_M does not work correctly.

To Reproduce

I run the Docker container and open the chat at http://localhost:8080/chat/gpt-4-vision-preview. I upload an image and ask 'What does the image describe?'
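
For reference, the same request can also be reproduced against the OpenAI-compatible API, bypassing the web chat. A minimal sketch, assuming LocalAI is listening on localhost:8080, the AIO image exposes the model under the alias gpt-4-vision-preview, and example.jpg is a local test image:

```python
# Sketch of reproducing the vision request via LocalAI's OpenAI-compatible endpoint.
# Assumptions: LocalAI at localhost:8080, model alias "gpt-4-vision-preview",
# and a local file "example.jpg" (all hypothetical placeholders).
import base64
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8080/v1", api_key="not-needed")

# Encode the image as a data URL so it can be sent inline with the prompt.
with open("example.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

response = client.chat.completions.create(
    model="gpt-4-vision-preview",
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "What does the image describe?"},
                {
                    "type": "image_url",
                    "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"},
                },
            ],
        }
    ],
)
print(response.choices[0].message.content)
```

Calling the API directly like this takes the web UI out of the equation, which may help narrow down whether the failure sits in the llama-cpp backend or in the chat frontend.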

Expected behavior

The model runs inference and responds to the question.

Logs

2024-06-27 14:26:30 6:26AM INF LocalAI version: v2.17.1 (8142bdc)
2024-06-27 14:26:30 WARNING: failed to determine nodes: open /sys/devices/system/node: no such file or directory
2024-06-27 14:26:30 WARNING: failed to read int from file: open /sys/class/drm/card0/device/numa_node: no such file or directory
2024-06-27 14:26:30 WARNING: failed to determine nodes: open /sys/devices/system/node: no such file or directory
2024-06-27 14:26:30 WARNING: error parsing the pci address "vgem"

....

2024-06-27 14:27:27 6:27AM INF Loading model 'llava-v1.6-mistral-7b.Q5_K_M.gguf' with backend llama-cpp
2024-06-27 14:27:27 WARNING: failed to determine nodes: open /sys/devices/system/node: no such file or directory
2024-06-27 14:27:27 WARNING: failed to read int from file: open /sys/class/drm/card0/device/numa_node: no such file or directory
2024-06-27 14:27:27 WARNING: failed to determine nodes: open /sys/devices/system/node: no such file or directory
2024-06-27 14:27:27 6:27AM INF Success ip=172.17.0.1 latency=1.21655ms method=POST status=200 url=/v1/chat/completions
2024-06-27 14:27:27 6:27AM INF Loading model 'llava-v1.6-mistral-7b.Q5_K_M.gguf' with backend llama-cpp
2024-06-27 14:27:27 WARNING: error parsing the pci address "vgem"
2024-06-27 14:27:27 6:27AM INF [llama-cpp] attempting to load with AVX2 variant
2024-06-27 14:27:29 WARNING: failed to determine nodes: open /sys/devices/system/node: no such file or directory
2024-06-27 14:27:29 WARNING: failed to read int from file: open /sys/class/drm/card0/device/numa_node: no such file or directory
2024-06-27 14:27:29 WARNING: failed to determine nodes: open /sys/devices/system/node: no such file or directory
2024-06-27 14:27:29 WARNING: error parsing the pci address "vgem"
2024-06-27 14:27:29 6:27AM INF [llama-cpp] attempting to load with AVX2 variant

Additional context

techResearcher2021 added the bug (Something isn't working) and unconfirmed labels on Jun 27, 2024
@pedroresende

Any news on this?

@mhaustria2

I am getting the same error, but only since today. It worked like a charm for the last 4 days. Any idea what the cause could be?

@gklank

gklank commented Oct 15, 2024

Hi,

I am using the image localai/localai:v2.22.0-aio-gpu-nvidia-cuda-11 and I have the same issue!

Any ideas for help?

Regards

Gerhard
