Google has released a new robots.txt report within Google Search Console. Google also made relevant information around robots.txt available from within the Page indexing report in Search Console. Finally, Google has decided to sunset the robots.txt tester.
The new robots.txt report. Google's new robots.txt report shows you some of the information Google has on your robots.txt file, including:
- Which robots.txt files Google found for the top 20 hosts on your site
- The last time Google crawled those files
- Any warnings or errors encountered
Google said it also added to this report the ability to request a recrawl of a robots.txt file for emergency situations.
What it looks like. You can access this report within Google Search Console under Settings; here is what it looks like:
More. Google also made relevant information around robots.txt available from within the Page indexing report in Search Console.
With this new robots.txt report, Google has decided to sunset the robots.txt tester.
More help. Google has a more detailed help document on this robots.txt report over here.
Why we care. If you are having indexing and crawling issues, this report may give you more insight into whether the issue is related to your robots.txt file.
You should probably review this report for the sites you manage within Google Search Console, just to ensure robots.txt directives are not blocking Google from accessing your site.
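As a quick sanity check outside of Search Console, you can also test robots.txt rules yourself. Here is a minimal sketch using Python's standard-library `urllib.robotparser`; the robots.txt content and URLs below are made up for illustration, and a real check would fetch your site's live robots.txt file instead:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content for illustration only; a real check
# would load https://yoursite.example/robots.txt instead.
ROBOTS_TXT = """\
User-agent: Googlebot
Disallow: /private/

User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Under these rules, Googlebot may crawl the homepage but not /private/.
print(parser.can_fetch("Googlebot", "https://yoursite.example/"))
print(parser.can_fetch("Googlebot", "https://yoursite.example/private/page"))
```

Note that `can_fetch` only evaluates the directives as written; Google's own parsing can differ in edge cases, so treat the new Search Console report as the authoritative view of what Google actually sees.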