Unable to automatically obtain labels #327
I just installed this last night, and I'm having the same issue. When I add a link directly, it is unable to fetch anything (no metadata other than the URL). When I try to add a bookmark via the CLI, it says: Error:
The web container shows a timeout, but I'm not sure whether it's downstream of the chrome or worker container:
Hopefully some of this helps.
@hongruilin to be able to help, we'll need to see the logs of your worker container. Also, if you're planning to just use OpenAI, you don't need to set the base URL.
@djl0 we need the logs from the worker container as well
@djl0 if your worker container is not able to talk to the chrome container, then this is your problem. The worker container is the one that schedules crawling requests on the chrome container. If they can't talk, no crawling will happen. Your issue seems different from @hongruilin's; you might want to open a separate issue.
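A quick way to verify that link, as a sketch: this assumes the stock docker-compose service names `workers` and `chrome`, that chrome exposes its DevTools endpoint on port 9222, and that busybox `wget` is available in the worker image — adjust to your setup.

```sh
# Hypothetical connectivity check from the worker container to chrome.
# Service names "workers" and "chrome" are taken from the stock
# docker-compose.yml; substitute your own if they differ.
docker compose exec workers sh -c 'wget -qO- http://chrome:9222/json/version'
# A JSON response mentioning "HeadlessChrome" means the worker can reach
# chrome; a timeout or "bad address" error means it cannot.
```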
@MohamedBassem Thanks for taking a look so quickly. Here is a snippet of the worker log. If you confirm that it can't talk to the chrome container, I will open a separate issue. Note that there were many timeout errors like the one at the top of this.
@djl0 hmmm, no, this doesn't seem like a chrome problem. It seems like a DNS/connectivity problem. This container doesn't seem to be able to reach the internet for some reason: it's failing to resolve DNS and sometimes times out.
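If it is DNS, that's usually checkable from inside the container itself. A minimal sketch, again assuming the worker service is named `workers` and that busybox `getent`/`wget` exist in the image (both are assumptions):

```sh
# Does DNS resolve inside the worker container?
docker compose exec workers sh -c 'getent hosts api.openai.com'
# Is general outbound connectivity working?
docker compose exec workers sh -c 'wget -q -O /dev/null https://example.com && echo ok'
```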
@MohamedBassem I've been playing around with it (e.g. trying a newer chrome image, adding an explicit DNS server to Docker). Not sure how much of that did anything, but these are my errors currently, and they no longer appear to me to be DNS-related. In fact, the dbus item is something I see in troubleshooting discussions for other projects using that image (see here). worker (many of these errors):
chrome (entire contents of log):
web (many of these errors):
I appreciate any insight.
The timeouts you're getting are Redis timeouts (Redis is used as the job queue). Is your redis container healthy?
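A minimal health probe for that, assuming the queue service is named `redis` in docker-compose.yml (a sketch, not specific to this project's setup):

```sh
# PONG means the server is up and accepting connections.
docker compose exec redis redis-cli ping
# Rough view of connected clients; the worker and web containers
# should each show up here if they can actually reach Redis.
docker compose exec redis redis-cli info clients
```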
This is the redis log (which seemed healthy to me?), and docker ps didn't show any errors (though I wouldn't necessarily expect it to).
@hongruilin sorry to take over your issue. Curious to know if your logs show something similar to mine.
Hello, I tried removing OPENAI_BASE_URL=https://api.openai.com from the .env file, and tags are now retrieved automatically; everything works fine. Could you please tell me whether setting OPENAI_BASE_URL=https://api.openai.com is correct? And if my region requires using Azure OpenAI or another intermediary compatible with the OpenAI interface, what should I do? Thank you very much for taking the time to respond to my question.
@hongruilin your problem is that OpenAI's base URL is https://api.openai.com/v1, not https://api.openai.com
@djl0 can you try running the version
When using OpenAI, I only need to set OPENAI_API_KEY to get tags. My guess was that if I also set OPENAI_BASE_URL to OpenAI's official API endpoint, it should still automatically get tags, but it failed. I would still like to put OpenAI's official API URL in the OPENAI_BASE_URL variable. So how should the OPENAI_BASE_URL parameter be set: https://api.openai.com/v1 or https://api.openai.com/?
I think it should be https://api.openai.com/v1
Thank you very much. https://api.openai.com/v1 is correct. Tags are now retrieved and everything works normally.
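For reference, a minimal .env sketch matching this resolution (the key value is a placeholder; whether an Azure-style gateway works depends on it exposing an OpenAI-compatible endpoint, which is an assumption here):

```sh
OPENAI_API_KEY=sk-...   # placeholder, use your own key
# Optional. Omit entirely for plain OpenAI, or point it at an
# OpenAI-compatible proxy. If set for OpenAI itself, it must
# include the /v1 suffix:
OPENAI_BASE_URL=https://api.openai.com/v1
```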
This is my .env configuration file. Not only can it not retrieve tags, but some English websites also fail to retrieve images. Even after changing servers in several countries, it still doesn't work. If you see my post, can you send me the .env content of the demo website?