Lighthouse unable to download robots.txt #10225
Comments
I can replicate this in DevTools and the CLI, so it's not a PSI/LR bug at least. Edit: at a cursory glance it might be because I can audit
Strange. See https://developer.mozilla.org/en-US/docs/Web/API/WindowOrWorkerGlobalScope/fetch
I have the same issue: "robots.txt is not valid. Lighthouse was unable to download a robots.txt file."
Found the issue for me: the CSP policy was set with `Header set Content-Security-Policy "default-src 'none';"`. Changing it to `Header set Content-Security-Policy "default-src 'self';"` fixed it.
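For anyone hitting the same thing, here is a minimal sketch of the fix described above as an Apache configuration fragment (assuming `mod_headers` is enabled; adjust for your own vhost or `.htaccess` setup):

```apache
# Blocks everything, including the same-origin fetch
# Lighthouse uses to download /robots.txt:
# Header set Content-Security-Policy "default-src 'none';"

# Allows same-origin requests, so the robots.txt audit can succeed:
Header set Content-Security-Policy "default-src 'self';"
```

Note that `'self'` is broader than `'none'`; if you want a stricter policy, you can keep `default-src 'none'` and allow only specific directives (see the next comment).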
Mozilla Observatory suggests that you deny by default. That way, you ensure a high(er) score at Mozilla Observatory and you have a passable workaround. BTW, this seems to be related to issue #4386.
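To illustrate why `default-src 'none'` trips this audit, here is a deliberately simplified sketch (not Lighthouse's actual code) that decides whether a CSP header would permit a same-origin `fetch()` such as the robots.txt download. It only considers `connect-src` (which governs `fetch()`) with its fallback to `default-src`, and only the `'self'` and `*` source values; real CSP matching also handles scheme/host/port sources and more:

```javascript
// Simplified check: would this CSP allow a same-origin fetch()?
// fetch() is governed by connect-src, falling back to default-src.
function allowsSameOriginFetch(cspHeader) {
  const directives = {};
  for (const part of cspHeader.split(';')) {
    const tokens = part.trim().split(/\s+/).filter(Boolean);
    if (tokens.length === 0) continue;
    directives[tokens[0].toLowerCase()] = tokens.slice(1);
  }
  const sources = directives['connect-src'] || directives['default-src'];
  if (!sources) return true; // no applicable directive: not restricted
  return sources.some((s) => {
    const v = s.toLowerCase();
    return v === "'self'" || v === '*';
  });
}

console.log(allowsSameOriginFetch("default-src 'none';")); // false
console.log(allowsSameOriginFetch("default-src 'self';")); // true
// A strict default with an explicit connect-src also works:
console.log(allowsSameOriginFetch("default-src 'none'; connect-src 'self'")); // true
```

The last case is the middle ground suggested above: deny by default, then allow only the directives you need.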
Describe the bug
I'm getting a lowered SEO score when measuring via https://web.dev, with the following issue reported:
However, https://www.wikipedia.org/robots.txt responds without issue. Perhaps it is getting blocked by something in the middleware, e.g. an Inspector rule of some kind in the way that web.dev/PSI configure Lighthouse? See also #10198, which might be similar.
To Reproduce
Steps to reproduce the behavior: