Can't connect to the ollama server at ECONNREFUSED #21

Open
gerazov opened this issue Nov 21, 2023 · 7 comments
@gerazov

gerazov commented Nov 21, 2023

I set the OLLAMA_HOST like so:

docker run -d -p 3000:3000 -e OLLAMA_HOST="http://127.0.0.1:11434/" --name chatbot-ollama ghcr.io/ivanfioravanti/chatbot-ollama:main

but I can't connect the app to the server. docker logs chatbot-ollama shows:

> [email protected] start
> next start

  ▲ Next.js 13.5.4
  - Local:        http://localhost:3000

 ✓ Ready in 236ms
 [TypeError: fetch failed] {
  cause:  [Error: connect ECONNREFUSED 127.0.0.1:11434] {
  errno: -111,
  code: 'ECONNREFUSED',
  syscall: 'connect',
  address: '127.0.0.1',
  port: 11434
}
}
@jowilf

jowilf commented Nov 22, 2023

If Ollama is running on your host, you have to remove -e OLLAMA_HOST="http://127.0.0.1:11434/" or replace it with OLLAMA_HOST="http://host.docker.internal:11434/", which is the default value.
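For example, something like this (a sketch based on the original command; on Linux you may also need --add-host=host.docker.internal:host-gateway for that hostname to resolve):

docker run -d -p 3000:3000 --add-host=host.docker.internal:host-gateway -e OLLAMA_HOST="http://host.docker.internal:11434/" --name chatbot-ollama ghcr.io/ivanfioravanti/chatbot-ollama:main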

@gerazov
Author

gerazov commented Nov 22, 2023

I'm not using Docker to serve ollama. I've used their default install script, which sets it up as a systemd service.

It's also available at http://127.0.0.1:11434/ - it echoes "Ollama is running" if I go to that URL in Firefox. I can also use it via other UIs (https://github.com/ollama-webui/ollama-webui) and Neovim plugins (https://github.com/nomnivore/ollama.nvim).
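For reference, a quick check from a host shell (assuming the default port) gives the same response:

curl http://127.0.0.1:11434/
# Ollama is running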

@ivanfioravanti
Owner

@jowilf is right, 127.0.0.1 is not reachable from within Docker, so the Next.js app can't reach your Ollama instance running locally.
ollama-webui (great project!) is probably running locally on your machine, same for the Neovim plugins.
Please test and let us know if it works.

@gerazov
Author

gerazov commented Dec 9, 2023

Hmm, but ollama-webui is running within Docker as per their instructions on Installing Ollama Web UI Only 🤔

@VertigoOne1

You can use '--network host' as part of the docker run to force the container onto the host network instead of isolating it, which lets it reach localhost when the systemd ollama service is in use.

Example:

docker run -e OLLAMA_HOST="http://localhost:11434" -p 3000:3000 --network host --name chat ghcr.io/ivanfioravanti/chatbot-ollama:main

is how I'm running it right now. Pretty good!
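Note that with --network host the -p 3000:3000 mapping is effectively ignored (the container shares the host's network stack), so both services should be reachable straight from the host, e.g.:

curl http://localhost:11434/    # should print "Ollama is running"
curl http://localhost:3000/     # chatbot-ollama UI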

@gerazov
Author

gerazov commented Dec 10, 2023

That's awesome 🤟 maybe this can be added to the README?
