
Synology Docker Install Fine; Setup Per Instructions; Won't Connect to Scraping APIs #104

Open
Seventy-9DegreesNorth opened this issue Jun 19, 2023 · 4 comments


@Seventy-9DegreesNorth

Hi there, I installed SerpBear on Synology per https://mariushosting.com/how-to-install-serpbear-on-your-synology-nas/ and deployed it via Portainer with the specified Docker compose. I used the specified Synology settings and kept port 3000; the one thing I did differently was to set up the Synology reverse proxy with a custom domain and a Let's Encrypt certificate.

I signed up for both ScrapingRobot and ScrapingAnt, entered each API key as outlined in the SerpBear documentation, updated the settings, and restarted the stack in Portainer. I can log in to SerpBear without issue and everything seems to be working: I can enter keywords and click to update SERPs, but no data is fetched. I know some of the keywords rank in the top 10 (and a few higher), but they all show no data, just "> 100".

I checked the Docker logs on Synology and got this (since this is public, in this paste of the log I used dummy keywords and changed the domain to mydomain.com):

[0] GET /api/domains?withstats=true
[0] domains: 0
[0] GET /api/settings
[0] POST /api/domains
[0] GET /api/domains?withstats=true
[0] domains: 1
[0] GET /api/settings
[0] GET /api/keywords?domain=mydomain-com
[0] POST /api/keywords
[0] GET /api/keywords?domain=mydomain-com
[0] [ERROR] Scraping Keyword : Test Keyword . Error: undefined
[0] [ERROR_MESSAGE]: TypeError: fetch failed
[0]     at Object.fetch (node:internal/deps/undici/undici:11413:11)
[0]     at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
[0]     at async scrapeKeywordFromGoogle (/app/.next/server/chunks/816.js:378:87)
[0]     at async Promise.all (index 0)
[0]     at async refreshKeywords (/app/.next/server/chunks/816.js:232:28)
[0]     at async refreshAndUpdateKeywords (/app/.next/server/chunks/816.js:110:23) {
[0]   cause: Error: getaddrinfo EAI_AGAIN api.scrapingant.com
[0]       at GetAddrInfoReqWrap.onlookup [as oncomplete] (node:dns:107:26) {
[0]     errno: -3001,
[0]     code: 'EAI_AGAIN',
[0]     syscall: 'getaddrinfo',
[0]     hostname: 'api.scrapingant.com'
[0]   }
[0] }
[0] ALL DONE!!!
[0] time taken: 5168.428608059883ms
[0] [Error: ENOENT: no such file or directory, open '/app/data/failed_queue.json'] {
[0]   errno: -2,
[0]   code: 'ENOENT',
[0]   syscall: 'open',
[0]   path: '/app/data/failed_queue.json'
[0] }
[0] [SUCCESS] Updating the Keyword: Keyword 3
[0] GET /api/keywords?domain=mydomain-com
[0] POST /api/keywords
[0] GET /api/keywords?domain=mydomain-com
[0] [ERROR] Scraping Keyword : keyword #2 . Error: undefined
[0] [ERROR_MESSAGE]: TypeError: fetch failed
[0]     at Object.fetch (node:internal/deps/undici/undici:11413:11)
[0]     at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
[0]     at async scrapeKeywordFromGoogle (/app/.next/server/chunks/816.js:378:87)
[0]     at async Promise.all (index 0)
[0]     at async refreshKeywords (/app/.next/server/chunks/816.js:232:28)
[0]     at async refreshAndUpdateKeywords (/app/.next/server/chunks/816.js:110:23) {
[0]   cause: Error: getaddrinfo EAI_AGAIN api.scrapingant.com
[0]       at GetAddrInfoReqWrap.onlookup [as oncomplete] (node:dns:107:26) {
[0]     errno: -3001,
[0]     code: 'EAI_AGAIN',
[0]     syscall: 'getaddrinfo',

I did some searching and it appears that the code EAI_AGAIN may point to a DNS failure, but I can't understand what would cause that here. Any other ideas on why I apparently can't connect to EITHER scraping API?
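For what it's worth, EAI_AGAIN means the resolver reported a temporary failure in name resolution, so the container apparently can't resolve the host at all; it isn't an API-key problem. Here's a minimal sketch to test that from inside the container (the filename is just for illustration; run it with a TS runner like tsx, or strip the types and run it with plain node):

```ts
// dns-check.mts (hypothetical filename) — run inside the SerpBear container.
// dns.lookup() goes through getaddrinfo, the same syscall that is
// returning EAI_AGAIN in the log above.
import { promises as dns } from "node:dns";

const hosts = ["api.scrapingant.com", "google.com"]; // failing host + a control

for (const host of hosts) {
  try {
    const { address } = await dns.lookup(host);
    console.log(`${host} -> ${address}`);
  } catch (err) {
    console.error(`${host} failed:`, (err as NodeJS.ErrnoException).code);
  }
}
```

If google.com resolves but api.scrapingant.com doesn't, the upstream DNS server is the likely culprit; if both fail, the container's own resolver (the Synology Docker bridge DNS) would be my first suspect.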

I checked the data files and they are being written -- but there's nothing in them -- I assume because it's not connecting to pull data.
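The ENOENT for /app/data/failed_queue.json in the log looks secondary: presumably that file is only written once a failed scrape is queued for retry, so reading it on a fresh install fails. A tolerant read would look something like this (a generic sketch, not SerpBear's actual code):

```ts
import { readFile } from "node:fs/promises";

// Hypothetical helper: treat a missing retry-queue file as an empty queue
// instead of letting ENOENT surface in the logs. The queue's element shape
// is unknown here, hence unknown[].
async function readFailedQueue(path = "/app/data/failed_queue.json"): Promise<unknown[]> {
  try {
    return JSON.parse(await readFile(path, "utf8"));
  } catch (err) {
    if ((err as NodeJS.ErrnoException).code === "ENOENT") return []; // not created yet
    throw err;
  }
}
```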

Thoughts are welcome.

@tsgoldladder

Experiencing the same, set up on Pikapods using ScrapingRobot.

Sometimes this EAI_AGAIN (getaddrinfo) issue happens, but I am getting some successes, so maybe it is intermittent?

More regular is this: about 1 in 3 keywords throws the following "Error: undefined" in the console:

[0] START SCRAPE: [tsgoldladder's keyword]
[0] [ERROR] Scraping Keyword : [tsgoldladder's keyword] . Error: undefined
[0] [ERROR_MESSAGE]: Error: [object Object]
[0]     at scrapeKeywordFromGoogle (/app/.next/server/chunks/816.js:396:19)
[0]     at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
[0]     at async refreshKeywords (/app/.next/server/chunks/816.js:236:42)
[0]     at async refreshAndUpdateKeywords (/app/.next/server/chunks/816.js:110:23)
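A note on that "Error: [object Object]" line: that string is what you get when a plain object is interpolated into a string, so whatever the scraper API actually returned is being lost before it reaches the log. An illustrative sketch of the pattern (the payload below is made up, not ScrapingRobot's real response):

```ts
// Made-up payload standing in for whatever the scraper API returned.
const apiResult: unknown = { error: "quota exceeded", status: 429 };

// What the log suggests is happening: interpolating a plain object
// stringifies it as "[object Object]", hiding the real message.
console.error(`Error: ${apiResult}`); // prints: Error: [object Object]

// Serializing keeps the payload readable for debugging:
console.error(`Error: ${JSON.stringify(apiResult)}`);
// prints: Error: {"error":"quota exceeded","status":429}
```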

@towfiqi
Owner

towfiqi commented Sep 21, 2023

@Seventy-9DegreesNorth ScrapingAnt disabled scraping Google for free users long ago. You will need to use an alternative scraper; choose one from this list: https://docs.serpbear.com/integrations

@towfiqi
Owner

towfiqi commented Sep 21, 2023

@tsgoldladder Does this happen when you click the "Reload All Serps" button?

@tsgoldladder

Yes, it does, but it also happens when selecting multiple keywords from the list and choosing "Refresh Keywords".
