Empty robots.txt is reported as not valid #9975

When a robots.txt of 0 bytes is created, e.g.

touch robots.txt

it's reported as not valid. Site example: https://plurrrr.com/
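To reproduce with the Lighthouse CLI (assuming the CLI is installed; robots-txt is the id of the SEO audit in question):

npx lighthouse https://plurrrr.com/ --only-audits=robots-txt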
Comments

An empty robots.txt is equivalent to a missing one, as far as crawlers are concerned. However, it's tough to know for sure the intent behind an empty file. I think we should continue failing this case, but with a better error message, and suggest a robots.txt that explicitly allows all crawling.
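For reference, a robots.txt that explicitly allows all crawling (an empty Disallow rule blocks nothing):

```
User-agent: *
Disallow:
```

Suggesting this file in the error message would make the intent explicit instead of leaving it ambiguous.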
If robots.txt is not there or its content is empty, the audit doesn't fail. The case passes due to this piece of code: https://github.com/GoogleChrome/lighthouse/blob/v5.6.0/lighthouse-core/audits/seo/robots-txt.js#L218-L223 The RobotsTxt audit failed for https://plurrrr.com/ because of a Content Security Policy that does not allow fetch calls. [Screenshot of what happened in the case of https://plurrrr.com/]
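A simplified sketch of the behavior described above (a paraphrase, not the exact Lighthouse source; parseRobots stands in for the real line-by-line validation):

```js
// Sketch: if the file downloaded but has no content, there is nothing to
// validate, so no errors are collected and the audit passes. Only a failed
// download (e.g. fetch blocked by the page's CSP) or parse errors fail it.
function auditRobotsTxt(artifacts) {
  const {status, content} = artifacts.RobotsTxt;
  if (status === null) {
    // The fetch itself failed, as it did for https://plurrrr.com/.
    return {score: 0, explanation: 'Unable to download a robots.txt file'};
  }
  const errors = content ? parseRobots(content) : []; // empty file -> no errors
  return {score: errors.length === 0 ? 1 : 0};
}
```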
It shouldn't be loaded via fetch. The robots.txt should be treated separately from CSPs. It's not content, so CSP should not apply; it's not like the browser is trying to display its contents as part of a document. CSP only affects resources referenced from directly navigated files, like HTML src attributes, etc. The robots.txt request must be made separately from the page, e.g. using a new tab. This issue should be retitled accordingly.
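A minimal sketch of that idea (an assumption about how it could be done, not Lighthouse's actual implementation): request robots.txt from outside the audited page, e.g. from Node, so the page's CSP never sees the request:

```js
// Requires Node 18+ for the global fetch.
async function fetchRobotsTxt(pageUrl) {
  const robotsUrl = new URL('/robots.txt', pageUrl).href;
  const res = await fetch(robotsUrl); // runs outside the page; CSP does not apply
  return {
    status: res.status,
    content: res.ok ? await res.text() : null,
  };
}

fetchRobotsTxt('https://plurrrr.com/').then(console.log);
```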
This is actually the same root issue as #4386, which is much broader and applies to many areas of Lighthouse. We'll de-dupe into there.
I'm also facing this problem. Can it be an obstacle to crawling and indexing? My website speed is 95+ for both. My website: Best Tech Club