Amica can't reach basic-openai-api-wrapper #96

Open

gitwittidbit opened this issue Apr 23, 2024 · 13 comments

Comments

@gitwittidbit

Hi,
I installed Amica via docker compose and got it working with OpenAI.
But I want to use local providers, so I am trying to connect Amica to my Ollama server. Unlike other frontends I am running (e.g. LibreChat), Amica can't connect to Ollama with the same connection info - I keep getting a network error.
What am I missing?
Thanks!

@slowsynapse
Collaborator

Not sure. By local providers, do you mean you are running Ollama on your own computer? I am not familiar with LibreChat; can you provide more info about the error you are getting?

@gitwittidbit
Author

Yes, I have Ollama running on my local server (but in a different docker stack). Another frontend that works with Ollama is Open-Webui.
So when I chat with Amica (and have Ollama configured as the backend), I am getting this:

[Screenshot 2024-04-25 at 11:00:02: the network error shown in Amica]

@slowsynapse
Collaborator

Hey, can you grab a screenshot from the developer console in your browser? Wondering if it's a CORS issue.

@gitwittidbit
Author

Okay, so it says that the loading of mixed content was blocked:

[Screenshot 2024-04-25 at 13:18:50: developer console showing a mixed-content blocking message]

Which is strange to me, because the link is http.

@napiquet

When you say "the link is http", do you mean that Amica is running on HTTP?

Both Amica and Ollama must be on either HTTP or HTTPS. You cannot have Amica running on HTTPS and try to connect to Ollama over plain HTTP.
You would either need to change Ollama to run securely as well, or downgrade Amica to non-secure HTTP.
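
If the HTTPS route is preferred, one way to satisfy this is to terminate TLS in front of Ollama with a reverse proxy. A minimal sketch using Caddy, assuming Caddy is installed on the Ollama host; `ollama.example.com` is a placeholder domain pointing at that host:

```sh
# Sketch: terminate TLS in front of a plain-HTTP Ollama so that an
# HTTPS-served Amica can reach it without mixed-content blocking.
# ollama.example.com is a placeholder; Caddy obtains a certificate for it.
caddy reverse-proxy \
  --from https://ollama.example.com \
  --to http://127.0.0.1:11434
```

Any TLS-terminating proxy (nginx, Traefik, etc.) does the same job; the point is only that the browser sees an https:// URL for the Ollama endpoint.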

@gitwittidbit
Author

I meant that Ollama was running on HTTP.

But I was actually accessing Amica via HTTPS. This is the same setup I use with open-webui, LibreChat and a few others.

I now tried accessing Amica via HTTP as well, but again got that network error. This time, though, the developer console actually mentions CORS (as @slowsynapse had suggested earlier):

[Screenshot 2024-04-25 at 14:57:49: developer console showing a CORS error]

It says that the request was blocked because of the same-origin policy (reason: CORS header 'Access-Control-Allow-Origin' missing), status code 403.
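
For anyone debugging the same thing: the browser's verdict can be reproduced outside the browser by sending the same Origin header with curl and inspecting the response. The host, port, and origin below are placeholders for this setup:

```sh
# Hypothetical check (addresses are placeholders): request a simple
# Ollama endpoint while presenting the Origin the browser would send.
# If the response carries no Access-Control-Allow-Origin header, the
# browser will block the cross-origin request, as seen above.
curl -i http://192.168.1.50:11434/api/tags \
  -H "Origin: http://192.168.1.60:3000"
```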

@napiquet

Oh, did you allow all origins (*) in Ollama? This is required when accessing it from another container stack.

For Amica, both sides still need to be on HTTP (or HTTPS), but you also need Ollama to allow the remote origin.
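
Ollama reads its allowed origins from the OLLAMA_ORIGINS environment variable. A sketch for a docker-based install (the container name and volume are illustrative; adapt to the existing compose stack):

```sh
# Sketch: start Ollama with all origins allowed via OLLAMA_ORIGINS.
# "*" is maximally permissive; a tighter alternative is to list the
# exact origin Amica is served from, e.g.
#   OLLAMA_ORIGINS="https://amica.example.com"
docker run -d \
  --name ollama \
  -e OLLAMA_ORIGINS="*" \
  -p 11434:11434 \
  -v ollama:/root/.ollama \
  ollama/ollama
```

In a compose file, the equivalent is an `environment:` entry such as `OLLAMA_ORIGINS=*` on the ollama service.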

@gitwittidbit
Author

Not sure where in Ollama I can set that.

But I can say that open-webui and LibreChat (and a few other frontends), all running on the same host (but in different docker stacks), can access Ollama. So it would seem that Ollama isn't overly picky about the origin of requests the way it is currently configured.

(I didn't want to overcomplicate things and wanted to solve one issue at a time, so I didn't mention it before, but Amica also can't connect to the TTS/STT backend, the basic-openai-api-wrapper, which is running directly on the same host (not in docker).)

I'm happy to make any changes to my Ollama setup if that helps. I just don't know where...

@napiquet

If your Amica container cannot access TTS/STT either, I believe the problem comes from your Amica installation, not Ollama. I also run open-webui, and I know you wouldn't be able to use it if your Ollama install wasn't already allowing all origins.

Are you accessing the app from the local network where your backends are?
I personally didn't manage to use Ollama when accessing Amica through a public domain; I had to access the container directly via its IP on my network.
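
Worth noting, judging by the console errors above: the chat request is issued by the browser itself, so the backend URL configured in Amica has to be reachable from the machine running the browser, not merely from the Amica container. A quick hedged check from that machine (IP and port are placeholders):

```sh
# From the machine running the browser, confirm the configured Ollama
# URL actually connects (IP/port are placeholders for this setup).
curl -s http://192.168.1.50:11434/
# A healthy Ollama answers with the text "Ollama is running".
```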

@gitwittidbit
Author

I can access Amica via HTTPS, which goes through a domain on the internet, and I can access it via HTTP at its local IP address. The IP address is the same for the other frontends and the Ollama (and TTS/STT) backend. (But I myself am on a different local network.)

Amica is able to connect to OpenAI - so its connectivity can't be totally messed up.

@gitwittidbit
Author

Tried it again today.

All external services seem to be reachable. But none of the local services can be accessed. How can that be?

@slowsynapse
Collaborator

Not sure. I think I would need your exact setup to see if the issue replicates in another environment.

@gitwittidbit
Author

I gave this another try today but to no avail.

So, my setup is: Amica is running in a Debian 12 VM with rootless docker on a Proxmox host. Ollama is also running in a Debian 12 VM with rootless docker on a Proxmox host. I also have librechat, open-webui, sillytavern, dify, big-AGI and others in the same environment, with comparable setups, and all of them are able to contact Ollama locally (over HTTP, despite being accessed via HTTPS themselves).

Based on your comment above that Amica and Ollama need to be either both on HTTP or both on HTTPS, I made Ollama available via an external HTTPS proxy, so that both are now accessible via HTTPS. But this does not make a difference: I am still getting the network error when prompting Amica.

When I switch the chatbot backend to ChatGPT, Amica can connect to it without an error. So what is the difference here?

Thanks!
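
One concrete difference worth checking: the browser appears to allow the ChatGPT backend because api.openai.com answers CORS preflight requests with an Access-Control-Allow-Origin header over HTTPS, and the local endpoints have to do the same before the browser will let the request through. A hedged comparison (the Ollama URL and origin are placeholders for this setup, and the paths are only illustrative):

```sh
# Compare preflight responses. api.openai.com should answer with an
# Access-Control-Allow-Origin header; the local/proxied Ollama endpoint
# must answer the same way, or the browser reports a network error.
curl -i -X OPTIONS https://api.openai.com/v1/chat/completions \
  -H "Origin: https://amica.example.com" \
  -H "Access-Control-Request-Method: POST"

curl -i -X OPTIONS https://ollama.example.com/api/chat \
  -H "Origin: https://amica.example.com" \
  -H "Access-Control-Request-Method: POST"
```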
