How To Use Robots.txt For Better SEO In Blogger


Robots.txt plays a major role in a blog's SEO. With robots.txt we tell search engine robots how to crawl our blog and how to index it. Robots.txt is not the same as the meta robots tags; it is a plain text file, edited from the blog's settings, that tells robots which pages to crawl and index and which to skip.

I have previously published a post on how to add a custom robots.txt in Blogger. If you have already added a robots.txt to your blog you can skip that post, but if you haven't added one yet, read it first and then come back here to learn how to use robots.txt properly. In this post I will show you how to allow or disallow robots to index pages or labels in search results.


Recommended: How to set custom robots header tags for better SEO in Blogger.

Allowing or disallowing labels in search results:


Let's start with allowing or disallowing robots to index or crawl the labels/categories of your blog, e.g. in the case of my blog: Seo, Menus, Blogging, etc. First of all go to Blogger Dashboard > Settings > Search preferences > Crawlers and indexing > Custom robots.txt and edit it. For instance, see the image below:



[Image: The Custom robots.txt option under Search preferences in the Blogger dashboard]

If you have read my post about adding a custom robots.txt, then your robots.txt will be similar to the code below:

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /
Sitemap: http://Yourblog.blogspot.com/feeds/posts/default?orderby=UPDATED

In the above code I have added /search beside Disallow, so robots will not index any of your categories. If your blog has sitelinks in search results and you wish to show the categories of your blog there, then remove /search and you are done!
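If you want to check what a rule like this actually blocks before relying on it, here is a minimal sketch using Python's standard urllib.robotparser; the blog address and post URL are placeholders, not real pages.

from urllib.robotparser import RobotFileParser

# The rules above, pasted as a string (only the part that matters for ordinary robots).
robots_txt = """\
User-agent: *
Disallow: /search
Allow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Label/category pages live under /search, so they are blocked ...
print(rp.can_fetch("*", "http://yourblog.blogspot.com/search/label/Seo"))        # False
# ... while normal post URLs are still allowed.
print(rp.can_fetch("*", "http://yourblog.blogspot.com/2015/06/some-post.html"))  # True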

How does it work?

It works in a very simple way. Let's look at the link of a category opened in the browser to see how it works. See the image below:

[Image: A label URL opened in the browser, with the /search part of the address highlighted]

In the above example I have opened my Seo label in the browser. You can see the highlighted part of the link; it appears in every category URL, so to block labels in search results you add /search beside Disallow. One more thing about adding /search: it not only blocks labels from being indexed, it also blocks any search query made on the blog from being indexed, because the link of a query made on the blog contains the same text (/search). See the example below:

[Image: A blog search query URL in the browser, which also contains /search]

In the above example, 101helper is a search query. If you wish to block only labels and not the search queries, use /search/label/ instead of /search, because label URLs always begin with /search/label/ while on-blog search queries look like /search?q=yourquery. That is the proper way to block only labels.
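Here is the same kind of sketch, assuming the usual Blogger URL patterns and a placeholder blog address, to confirm that Disallow: /search/label/ blocks label pages while leaving on-blog search queries crawlable.

from urllib.robotparser import RobotFileParser

# Block only labels, not on-blog search queries (yourblog.blogspot.com is a placeholder).
robots_txt = """\
User-agent: *
Disallow: /search/label/
Allow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("*", "http://yourblog.blogspot.com/search/label/Seo"))    # False - label blocked
print(rp.can_fetch("*", "http://yourblog.blogspot.com/search?q=101helper"))  # True  - search query still allowed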

Allowing or disallowing pages in search results:

Nowadays every blog has pages like Contact, About, Copyright, Sitemap, etc., but nobody wants pages like Contact, About and Copyright indexed (I am not sure about Sitemap :p). The reason not to index Contact and About pages is that they aren't useful or interesting for search visitors. You can control whether robots index your pages or not, but keep one thing in mind: you can't control a specific page individually, so if you add /p beside Disallow: then none of your blog's pages will be indexed. See the example of a Blogger page link below:

[Image: The URL of a Blogger static page, which always contains /p/]
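As a quick check that the /p rule behaves this way, here is another small urllib.robotparser sketch; the page, post and blog addresses are placeholders.

from urllib.robotparser import RobotFileParser

# Block static pages only (the addresses below are placeholders).
robots_txt = """\
User-agent: *
Disallow: /p
Allow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Blogger static pages live under /p/, so they are blocked ...
print(rp.can_fetch("*", "http://yourblog.blogspot.com/p/contact.html"))          # False
# ... while posts (/year/month/title.html) and the homepage remain crawlable.
print(rp.can_fetch("*", "http://yourblog.blogspot.com/2015/06/some-post.html"))  # True
print(rp.can_fetch("*", "http://yourblog.blogspot.com/"))                        # True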
  
You can use the robots.txt Tester in Google Webmaster Tools while making changes to your blog's robots.txt. It shows a live preview of your blog's robots.txt at Google Webmaster Tools > Crawl > robots.txt Tester, and after making changes you can test every URL of your blog to see whether it is allowed for robots or not. See an example of a page blocked for Google's robots in the image below:

[Image: The robots.txt Tester in Google Webmaster Tools showing a blocked URL]
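If you prefer to test from your own machine instead of (or in addition to) the Webmaster Tools tester, here is a rough local equivalent that downloads the blog's live robots.txt and checks a few URLs; the blog address and URLs are placeholders.

from urllib.robotparser import RobotFileParser

# Fetch and parse the live robots.txt of the blog (placeholder address).
rp = RobotFileParser()
rp.set_url("http://yourblog.blogspot.com/robots.txt")
rp.read()

# Check a handful of typical Blogger URLs against it.
for url in [
    "http://yourblog.blogspot.com/",                        # homepage
    "http://yourblog.blogspot.com/p/contact.html",          # static page
    "http://yourblog.blogspot.com/search/label/Seo",        # label
    "http://yourblog.blogspot.com/2015/06/some-post.html",  # post
]:
    print(url, "->", "allowed" if rp.can_fetch("Googlebot", url) else "blocked")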

If you wish to block all pages in search results, just add /p beside Disallow:. It will block only pages such as Contact, About, Sitemap and Privacy Policy, not posts. You can't block or allow a specific page of your blog individually.

Blocking both pages and labels in search results:

If you want to block both pages and categories/labels, then your robots.txt should look like the code below:

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /p
Disallow: /search
Allow: /
Sitemap: http://101helper.blogspot.com/feeds/posts/default?orderby=UPDATED

The above code tells robots not to crawl any link whose path starts with /p or /search, so robots won't index any labels or pages.
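As a final sanity check for this combined file, the sketch below (same assumptions and placeholder URLs as before) confirms that both pages and labels are blocked while posts and the homepage stay allowed.

from urllib.robotparser import RobotFileParser

# The combined rules: block static pages (/p) and everything under /search.
robots_txt = """\
User-agent: *
Disallow: /p
Disallow: /search
Allow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("*", "http://yourblog.blogspot.com/p/about.html"))            # False - page
print(rp.can_fetch("*", "http://yourblog.blogspot.com/search/label/Blogging"))   # False - label
print(rp.can_fetch("*", "http://yourblog.blogspot.com/"))                        # True  - homepage
print(rp.can_fetch("*", "http://yourblog.blogspot.com/2015/06/some-post.html"))  # True  - post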

Allowing all pages, labels and the homepage to be indexed in search results:

If you want all of your blog's pages, labels, homepage, posts, etc. to be indexed, use the code below as your blog's robots.txt.

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow:
Allow: /
Sitemap: http://101helper.blogspot.com/feeds/posts/default?orderby=UPDATED

The above code tells robots to index all pages, labels, posts, the homepage, etc.

Final words:

You can also use robots.txt to block posts published in a specific year, since Blogger post URLs begin with the year of publication (e.g. /2015/). Robots.txt plays a vital role in a blog's SEO, so use it carefully or else you could lose all of your blog's traffic.

I hope you liked this post and found it informative. If you have any questions, feel free to ask in the comments. Follow and subscribe to get the latest updates about Blogger SEO. Thanks for visiting 101Helper. Share this post with others and help me spread 101Helper.

Search tags: Robots.txt, Using robots text for better seo in blogger, Robots.txt for seo, Seo tips for blogger, 101Helper seo tips for blogger, How to use robots text to index posts in google, how to edit sitelinks in google search results.