Can't connect to the ollama server at ECONNREFUSED #21
Comments
If ollama is running on your host, you have to remove …
I'm not using Docker to serve ollama. I've used their default install script, which sets it up as a systemd service. It's also available on …
@jowilf is right: 127.0.0.1 is not reachable from within Docker, so the Next.js app can't reach your Ollama instance running locally.
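To make the failure mode concrete: inside a container on the default bridge network, `127.0.0.1` is the container's own loopback interface, not the host's, so connecting to a host-only Ollama port is refused. A minimal Python sketch (illustrative, not part of the thread; `probe` is a hypothetical helper) that reproduces an ECONNREFUSED:

```python
import socket

def probe(host: str, port: int, timeout: float = 1.0) -> str:
    """Attempt a TCP connection and describe the outcome."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.settimeout(timeout)
    try:
        sock.connect((host, port))
        return "open"
    except ConnectionRefusedError:
        return "ECONNREFUSED"  # nothing is listening at that address
    except OSError:
        return "unreachable"   # timeout, no route, etc.
    finally:
        sock.close()

# Run inside an isolated container, this probes the *container's* loopback,
# not the host's, so the host-only Ollama server is never reached:
print(probe("127.0.0.1", 11434))
```

This is the same check the Next.js app effectively performs when it dials `OLLAMA_HOST` from inside the container.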
Hmm, but ollama-webui is running within Docker as per their instructions on Installing Ollama Web UI Only 🤔
You can use `--network host` as part of the `docker run` command to force the container onto the host network instead of getting isolated, which allows it to access localhost when the systemd ollama service is in use. For example, `docker run -e OLLAMA_HOST="http://localhost:11434" -p 3000:3000 --network host --name chat ghcr.io/ivanfioravanti/chatbot-ollama:main` is how I'm running it right now. Pretty good!
That's awesome 🤟 Maybe this can be added to the README?
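If joining the host network is undesirable (it disables Docker's port isolation entirely), a possible alternative, assuming Docker 20.10+ where the `host-gateway` special value is available, is to map `host.docker.internal` to the host instead. This is a sketch of the idea, not a command from the thread:

```shell
# Sketch: reach the host's Ollama without --network host.
# --add-host maps host.docker.internal to the host's gateway IP (Docker 20.10+).
docker run -d -p 3000:3000 \
  --add-host=host.docker.internal:host-gateway \
  -e OLLAMA_HOST="http://host.docker.internal:11434" \
  --name chatbot-ollama ghcr.io/ivanfioravanti/chatbot-ollama:main
```

Note the assumption here: this only works if the host's ollama listens on an interface reachable from the container (e.g. bound to `0.0.0.0` rather than only `127.0.0.1` in the systemd unit).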
I set the `OLLAMA_HOST` like so: `docker run -d -p 3000:3000 -e OLLAMA_HOST="http://127.0.0.1:11434/" --name chatbot-ollama ghcr.io/ivanfioravanti/chatbot-ollama:main` and can't connect the app to the server.
`docker logs chatbot-ollama` reads: