First, go to Blogger.com >> your site >> Settings >> Search preferences >> Crawlers and indexing. There you will see two options: Custom robots.txt and Custom robots header tags. These two options give you the flexibility to customize your robots.txt file.
- Custom robots.txt: This option lets you edit your entire robots.txt file. You simply type the rules that tell search engine spiders which content should or should not be crawled. You can always undo your changes and revert to the default file (a sketch of a typical default file follows this list).
- Custom robots header tags: This option is a bit more complicated. It does not let you write your own rules; instead, it offers a few options with checkboxes. So if you have no idea about robots header tags, it is best to leave this option alone.
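Before you change anything, it helps to know what you are starting from. For a typical Blogger blog, the default robots.txt usually looks something like the sketch below (the exact contents can vary from blog to blog, and yourblog.blogspot.com is just a placeholder for your own address):

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://yourblog.blogspot.com/sitemap.xml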
Now enable the custom robots.txt file by pressing the Edit button next to the “Custom robots.txt” option. Blogger will then ask whether you want to enable custom robots.txt content; press “Yes” and proceed to the next step.
In the text area, type the rules for the content you want to exclude from crawling. After editing the file to suit your needs, press the Save button to finish. If you ever want to revert to the default robots.txt file, simply select “No” instead of “Yes”.
For example, to block all crawlers from your /about, /contact, and /services pages, the three Disallow rules can share a single User-agent group:

User-agent: *
Disallow: /about
Disallow: /contact
Disallow: /services
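If you want to double-check what your rules actually block before saving them in Blogger, you can test them locally with Python's built-in urllib.robotparser module. This is only a rough sketch; the blog address and paths are placeholders for your own:

```python
from urllib import robotparser

# The rules you plan to paste into Blogger's custom robots.txt box
rules = """\
User-agent: *
Disallow: /about
Disallow: /contact
Disallow: /services
"""

parser = robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# Placeholder blog address; replace with your own
base = "https://yourblog.blogspot.com"
for path in ["/about", "/contact", "/services", "/"]:
    allowed = parser.can_fetch("*", base + path)
    print(f"{path}: {'allowed' if allowed else 'blocked'}")
```

Running this should report /about, /contact, and /services as blocked and the homepage as allowed, confirming the rules do what you expect.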
We hope this tip helps you. If you run into any issues with crawling, feel free to leave your questions in the comments below and our experts will try to help you solve them.