What is Robots.txt Testing Tool?
The robots.txt Tester tool tells you whether your current robots.txt file blocks Google's search crawlers from accessing specific URLs on your site. In other words, you can use it to check whether Googlebot is allowed to crawl the URL of a page you wish to keep out of Google search.

How to Test Your Robots.txt File with the Robots.txt Testing Tool:
First, log in to Google Webmaster Tools, open the robots.txt Tester, and select the property you want to test from the list of your verified properties.
You will now see your current robots.txt file, and you can test different URLs to check whether Google's crawlers are disallowed from crawling them.
Type a URL into the text box at the bottom of the page and press the Test button. The Test button will change to either "ACCEPTED" or "BLOCKED", depending on whether the rules in your robots.txt file block Google's crawlers from accessing the URL you entered.
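If you want to sanity-check rules outside the tester, you can mirror the same accept/block decision locally with Python's standard library. This is just a sketch: the rules and URLs below are made-up examples, not taken from any real site, and the parser follows Python's interpretation of robots.txt rather than Google's tool exactly.

```python
# Check whether Googlebot may crawl given URLs under a sample robots.txt,
# using only the Python standard library.
from urllib.robotparser import RobotFileParser

# Hypothetical rules: block /private/ for Googlebot, allow everything else.
rules = """\
User-agent: Googlebot
Disallow: /private/
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Allowed URL: not under /private/, so this prints True.
print(parser.can_fetch("Googlebot", "https://example.com/public/page.html"))

# Blocked URL: matches the Disallow rule, so this prints False.
print(parser.can_fetch("Googlebot", "https://example.com/private/page.html"))
```

Swap in your own rules and URLs to preview what the tester's "ACCEPTED" or "BLOCKED" verdict is likely to be before you touch the live file.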
Edit the robots.txt file in the tester as needed and retest until you are satisfied. Once you have finished writing your new rules, copy the whole file and paste it into the robots.txt file hosted on your site.
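For reference, a minimal robots.txt might look like the following. The paths and sitemap URL are purely illustrative, not from any particular site:

```
# Block all crawlers from these directories
User-agent: *
Disallow: /admin/
Disallow: /tmp/

# Let Googlebot crawl everything else
User-agent: Googlebot
Allow: /

Sitemap: https://example.com/sitemap.xml
```

Each User-agent group applies its own rules, and a crawler uses the most specific group that matches it.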
Note: this tool does not change the robots.txt file on your site; it only tests against the copy loaded in the tool, so you have to upload the updated file yourself.
We have already written a tutorial on how to edit a robots.txt file in Blogger, so please take a look at it if you don't know how to make these changes.