Audit your robots.txt file
1. Open Google’s robots.txt Tester, signed in with the Google account you use for Google Search Console, and check for warnings and errors.
Google highlights any warnings and errors in the File Editor and displays the number of warnings and errors below it.
2. Use Bing’s Robots.txt tester or third-party tools like TechnicalSEO’s robots.txt Validator and Testing Tool to check for errors with other search engine crawlers.
Google’s robots.txt Tester checks only Google crawlers such as Googlebot. Use Bing’s tester or a third-party tool such as TechnicalSEO.com’s for any other search engines you are concerned about.
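You can also spot-check crawler access yourself with Python’s standard-library robots.txt parser. The sketch below uses a hypothetical robots.txt body and example URLs; in practice you would point the parser at your own file.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content for illustration. For your real site,
# use parser.set_url("https://example.com/robots.txt") and parser.read().
robots_txt = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Ask whether specific crawlers may fetch a given URL.
print(parser.can_fetch("Googlebot", "https://example.com/private/page"))  # False
print(parser.can_fetch("Bingbot", "https://example.com/blog/post"))       # True
```

This checks the same rules for any user agent you name, which is handy when you care about crawlers beyond Googlebot.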
3. Check the Coverage report in Google Search Console for any errors or warnings.
Be sure to tick the ‘Excluded URLs’ checkbox and check error types to determine whether any page URLs have been blocked by your robots.txt file.
4. Remove any pages that shouldn’t be crawled from your sitemap.
See the Additional Resources for more information on how to do this, including how to remove noindex pages from your sitemap.
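As a rough sketch of this step, the snippet below drops blocked URLs from a sitemap using Python’s standard XML library. The sitemap content and the blocked URL set are hypothetical examples; you would load your real sitemap.xml and your own list of blocked or noindex pages.

```python
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

# Hypothetical sitemap for illustration; load your real sitemap.xml instead.
sitemap_xml = f"""<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="{NS}">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/private/page</loc></url>
</urlset>"""

# URLs blocked by robots.txt (or marked noindex) that should leave the sitemap.
blocked = {"https://example.com/private/page"}

root = ET.fromstring(sitemap_xml)
for url in list(root.findall(f"{{{NS}}}url")):
    loc = url.find(f"{{{NS}}}loc").text.strip()
    if loc in blocked:
        root.remove(url)

remaining = [u.find(f"{{{NS}}}loc").text for u in root.findall(f"{{{NS}}}url")]
print(remaining)  # ['https://example.com/']
```

After filtering, you would write the cleaned tree back out and resubmit the sitemap in Search Console.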
5. Fix the issues with your robots.txt file based on the recommendations below:
If the testers flag any errors, ask your web developer to fix them; if there are warnings, ask your developer for advice on whether they matter. If the error says a page was submitted for indexing but is blocked by robots.txt, check whether the page should in fact be indexed. If it should, remove it from your robots.txt file by deleting the line that looks like this: Disallow: /the-page-url/. Discuss this with your web developer before making the change.
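As a concrete illustration, here is a hypothetical robots.txt before and after removing the rule that blocks a page you want indexed (the paths are examples only):

```
# Before: /the-page-url/ is blocked from crawling
User-agent: *
Disallow: /admin/
Disallow: /the-page-url/

# After: delete only the offending line, leaving other rules intact
User-agent: *
Disallow: /admin/
```

Only the single Disallow line for the affected page is removed; the rest of the file should stay as your developer configured it.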