What is Robots.txt?

Robots.txt is a text file containing a few lines of simple directives. It is saved on the website or blog's
server and tells web crawlers how to crawl and index your blog in search results. That means you can
block any web page on your blog from crawlers so that it doesn't get indexed by search engines, such as
your blog's label pages, a demo page, or any other page that isn't important enough to be indexed.
Always remember that search crawlers read the robots.txt file before crawling any web page.
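For example, to keep crawlers away from a single unimportant page, you would add a Disallow rule like the one below. This is only a sketch: /p/demo.html is a made-up path used for illustration, so substitute the path of the page you actually want to block.

User-agent: *
Disallow: /p/demo.html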
Each blog hosted on Blogger has a default robots.txt file, which looks something like this:
User-agent: Mediapartners-Google
Disallow:
User-agent: *
Disallow: /search
Allow: /
Sitemap: http://example.blogspot.com/feeds/posts/default?orderby=UPDATED
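Here, Mediapartners-Google is Google's AdSense crawler; the empty Disallow under it blocks nothing, so ads can be matched to your content. User-agent: * covers all other crawlers: Disallow: /search keeps label and search-result pages (any URL beginning with /search) out of the index, while Allow: / permits the homepage and posts. The Sitemap line points crawlers to the blog's post feed so new posts are discovered faster.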
Enjoy blogging and have fun!