What is a robots.txt file?

A robots.txt file tells search engine crawlers which pages or files the crawler can or can’t request from your site. In short, it tells a bot whether or not it should visit a particular page. The robots.txt file is one of the key concepts of Search Engine Optimization (SEO), specifically Technical SEO.


What is robots.txt used for?

robots.txt is used primarily to manage crawler traffic to your site.

Example of a robots.txt file with two rules:

# Group 1
User-agent: Googlebot
Disallow: /label/

# Group 2
User-agent: *
Allow: /

Sitemap: http://www.yourwebsite.com/sitemap.xml

Let us understand what is going on:

1) The user agent named “Googlebot” should not crawl the folder http://yourwebsite.com/label/ or any of its subdirectories.

2) All other user agents can access the entire site.

3) The site’s Sitemap file is located at http://www.yourwebsite.com/sitemap.xml.
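You can verify how crawlers interpret these rules yourself. Here is a minimal sketch using Python’s standard-library urllib.robotparser module, parsing the example rules above directly from a string (the URLs are the placeholder addresses from the example):

```python
from urllib.robotparser import RobotFileParser

# Normally you would point the parser at a live file with
# rp.set_url("http://www.yourwebsite.com/robots.txt") and rp.read();
# here we parse the example rules from a string instead.
rp = RobotFileParser()
rp.parse("""
User-agent: Googlebot
Disallow: /label/

User-agent: *
Allow: /
""".splitlines())

# Googlebot may not crawl anything under /label/
print(rp.can_fetch("Googlebot", "http://www.yourwebsite.com/label/post1"))  # False

# All other user agents (e.g. Bingbot) may access the entire site
print(rp.can_fetch("Bingbot", "http://www.yourwebsite.com/label/post1"))    # True
```

Well-behaved crawlers apply the most specific matching user-agent group, which is why Googlebot is blocked from /label/ while every other bot is allowed everywhere.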

Read more about robots.txt: https://support.google.com/webmasters/answer/6062608?hl=en
