Replies: 2 comments
-
@wencan you are using the All-In-One image, which comes with pre-configured models. If you don't need pre-configured models, use the standard images: https://localai.io/basics/container/
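For reference, a minimal sketch of both invocations. The tag names and the `/build/models` mount path are assumptions taken from the linked docs and may differ for your version — verify against https://localai.io/basics/container/ before running:

```shell
# Standard image: no pre-configured models; mount your own models directory.
docker run -p 8080:8080 -v $PWD/models:/build/models localai/localai:latest

# All-in-One (AIO) image: downloads and pre-configures a model set on first start.
docker run -p 8080:8080 localai/localai:latest-aio-cpu
```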
-
@mudler When I run the Docker image localai/localai:latest (hash 47933b52bde3), it first prints "===> LocalAI All-in-One (AIO) container starting..." and then starts downloading models. This indicates it is an AIO image, not a standard image as the documentation states. When I mount a local directory into an AIO container, the container automatically creates some files in that directory. If I then mount the same directory into a standard container, the standard container continues the AIO container's behavior unless I manually delete the files the AIO container created.
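One way to reset such a directory is to delete everything except your own model file before reusing it with a standard container. A minimal sketch, assuming the only files you placed there yourself are `*.gguf` (the directory name `models_demo` and the file names below are illustrative, not taken from the thread):

```shell
# Simulate a mounted models directory: one hand-placed gguf model
# plus a leftover file an AIO container might have written.
mkdir -p models_demo
touch models_demo/my-model.gguf models_demo/aio-generated.yaml

# Remove every regular file that is not a .gguf model.
find models_demo -type f ! -name '*.gguf' -delete

# Only the gguf file should remain.
ls models_demo
```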
-
As a first-time user, I find the documentation and behavior of LocalAI very perplexing. I am unsure whether I have set up LocalAI properly and whether it is actually running.
I followed the instructions to launch LocalAI using the provided command:
The models directory contains only a single gguf file.
Upon launch, LocalAI starts downloading a model I did not ask for. Since I am located in mainland China, this download is bound to fail.
LocalAI then displays the listening address. I opened that page, but its content is completely unrelated to my needs: the model I specified does not appear, and the chat function is also unusable.
Using curl http://localhost:8080/v1/models, I obtained the following output:
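The `/v1/models` endpoint returns an OpenAI-style JSON list of model entries. A minimal sketch of pulling out just the model ids without extra tooling — the sample payload here is an assumption for illustration, not actual LocalAI output:

```shell
# Sample OpenAI-style response from /v1/models (illustrative payload).
response='{"object":"list","data":[{"id":"my-model.gguf","object":"model"}]}'

# Extract the "id" values with grep/sed.
echo "$response" | grep -o '"id":"[^"]*"' | sed 's/"id":"\(.*\)"/\1/'
```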
I am unsure of the origin of the listed models except for the last one.
Finally, I attempted to initiate a conversation with the model using curl:
Regardless of the prompt, the response is always the same:
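For comparison, a chat request in the OpenAI-compatible shape LocalAI exposes, with the `model` field set explicitly to the gguf file name. The model name here is a placeholder, not taken from the thread; it must match an id reported by `/v1/models`:

```shell
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "my-model.gguf",
        "messages": [{"role": "user", "content": "Hello"}]
      }'
```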