
adds robots.txt per search.gov recs #682

Merged 1 commit into main from robots-txt on Aug 27, 2024

Conversation

@jduss4 (Contributor) commented on Aug 26, 2024

Closes #243

Changes proposed in this pull request:

  • adds a robots.txt file to the guides repo
  • allows all robots and sets a 2-second crawl delay, so search.gov fetches at most one page every 2 seconds (see the sketch below)
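
The merged file itself isn't reproduced in this conversation, so the following is only a minimal sketch of a robots.txt matching that description; the search.gov crawler's user agent name (usasearch) is an assumption here, not something stated in this PR.

```
# Sketch only; the actual file merged in this PR is not shown in the conversation.
# Allow all crawlers and ask them to wait 2 seconds between requests.
User-agent: *
Allow: /
Crawl-delay: 2

# search.gov's crawler (agent name assumed, not confirmed by this PR)
User-agent: usasearch
Crawl-delay: 2
```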

Questions:

  • are there any agents that we want to address specifically?
  • are there any pages that we don't want crawled?

Security considerations:

No concerns.

@jduss4 requested a review from a team as a code owner on August 26, 2024
@jskinne3 (Contributor) left a comment:

I'm a little surprised to learn we didn't already have a robots.txt file!

@jduss4 merged commit ff0a6a9 into main on Aug 27, 2024 (8 checks passed)
@jduss4 deleted the robots-txt branch on August 27, 2024
@jduss4 mentioned this pull request on Aug 27, 2024
Development

Successfully merging this pull request may close these issues.

[Request]: Add a robots.txt file once guides are fully migrated