Amica can't reach basic-openai-api-wrapper #96
Hi,

I installed Amica via docker compose and got it working with OpenAI.

But I want to use local providers, so I am trying to connect Amica to my Ollama server. Unlike other frontends I am running (e.g. LibreChat), Amica can't connect to Ollama with the same info - I keep getting a network error.

What am I missing?

Thanks!

Comments
Not sure, by local providers do you mean you are running Ollama on your computer? I am not familiar with LibreChat. Can you provide more info about the error you are getting?
Hey, can you grab a screenshot from the developer console in your browser? Wondering if it's a CORS issue.
When you say "link is http", do you mean that Amica is running on HTTP? Amica and Ollama must both be on HTTP or both on HTTPS. You cannot have Amica running on HTTPS and try to connect to Ollama over plain HTTP; browsers block that as mixed content.
I meant that Ollama was running on HTTP, but I was actually accessing Amica via HTTPS. This is the same setup I use with open-webui, LibreChat and a few others. I now tried accessing Amica via HTTP as well, but again got that network error. This time, though, the developer console actually mentions CORS (as @slowsynapse had suggested earlier): it says the request was blocked by the same-origin policy (reason: CORS header 'Access-Control-Allow-Origin' missing), status code 403.
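For anyone hitting the same thing, a minimal sketch of how to reproduce the browser's CORS preflight from a shell; the host, port, and Origin below are placeholders for your own setup:

```bash
# Simulate the preflight request the browser sends before the actual POST.
curl -i -X OPTIONS http://192.168.1.10:11434/api/chat \
  -H "Origin: https://amica.example.com" \
  -H "Access-Control-Request-Method: POST"
# A 403, or a response without an Access-Control-Allow-Origin header,
# means Ollama rejects that origin and the browser will block the call.
```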
Oh, did you allow all origins (*) in Ollama? This is required when accessing from another container stack. For Amica, both still need to be on HTTP (or both on HTTPS), but you also need Ollama to allow the remote origin.
Not sure where in Ollama I can set that. But I can say that open-webui and LibreChat (and a few other frontends), all running on the same host but in different docker stacks, can all access Ollama. So it would seem that Ollama isn't overly picky about the origin of the requests as it is configured at the moment. (I didn't want to overcomplicate things and wanted to solve one issue at a time, so I didn't mention it before, but Amica also can't connect to the TTS/STT backend with the openai-wrapper, which is running directly on the same host, not in docker.) I'm happy to make any changes to my Ollama setup if that helps. I just don't know where...
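For reference, Ollama reads its allowed origins from the OLLAMA_ORIGINS environment variable. A minimal sketch for a Docker-based install; the container name, port, and image tag are just the defaults and may differ in your compose file:

```bash
# docker run form; in a compose file, put OLLAMA_ORIGINS under `environment:`.
docker run -d --name ollama -p 11434:11434 \
  -e OLLAMA_ORIGINS="*" \
  ollama/ollama

# Bare-metal installs managed by systemd can set the same variable via
# `systemctl edit ollama.service`:
#   [Service]
#   Environment="OLLAMA_ORIGINS=*"
```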
If your Amica container cannot access TTS/STT either, I believe the problem comes from your Amica installation, not Ollama. I also run open-webui and I know you wouldn't be able to use it if your Ollama install wasn't already allowing all origins. Are you accessing the app through the local network where your backends are?
I can access Amica via HTTPS, which goes through a domain on the internet, and I can access it via HTTP under its local IP address. The IP address is the same for the other frontends and for the Ollama (and TTS/STT) backends. (But I myself am on a different local network.) Amica is able to connect to OpenAI, so its connectivity can't be totally messed up.
Tried it again today. All external services seem to be reachable, but none of the local services can be accessed. How can that be?
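One thing worth ruling out here, assuming Amica sends its chat requests directly from the browser rather than proxying them through its own server (which would explain exactly this pattern): the requests would then originate from the visitor's machine, so Ollama has to be reachable from wherever the browser runs, not just from the Amica VM. A quick check, with a placeholder address:

```bash
# Run this on the machine where the browser is open, not inside the Amica VM:
curl -i http://192.168.1.10:11434/api/tags
# If this fails here but succeeds from the Amica VM, the "network error"
# is the browser itself failing to reach Ollama across networks.
```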
Not sure. I think I would need your exact setup to see if the issue replicates in another environment.
I gave this another try today, but to no avail. So, my setup is: Amica is running in a Debian 12 VM with rootless docker on a Proxmox host. Ollama is also running in a Debian 12 VM with rootless docker on a Proxmox host. I also have LibreChat, open-webui, SillyTavern, Dify, big-AGI and others in the same environment with comparable setups, and all of them are able to contact Ollama locally (over HTTP, despite being accessed via HTTPS themselves).

Based on your comment above that Amica and Ollama both need to be either on HTTP or on HTTPS, I made Ollama available via an external HTTPS proxy, so that both are now accessible via HTTPS. But this makes no difference: I still get the network error when prompting Amica. When I switch the chatbot backend to ChatGPT, Amica connects without an error. So what is the difference here? Thanks!
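A way to narrow this down, with placeholder hostnames and model name: call the OpenAI-compatible chat endpoint that Ollama exposes, once from the Amica VM and once from the machine the browser runs on, and compare the results:

```bash
# Ollama serves an OpenAI-style chat endpoint; adjust URL and model name.
curl -s https://ollama.example.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "llama3", "messages": [{"role": "user", "content": "say hi"}]}'
# Working from the VM but failing from the browser's machine points to
# client-side network reachability rather than Amica's configuration.
```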