What is a robots.txt file?
A robots.txt file tells search engine crawlers which pages or files the crawler can or can't request from your site. In other words, it tells a bot whether or not it should visit a particular page. The robots.txt file is one of the key concepts of Search Engine Optimization (SEO), and more specifically of Technical SEO.
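As an illustration, here is a minimal robots.txt file. It lives at the root of the site (e.g. `https://example.com/robots.txt`); the rules below are hypothetical paths chosen only to show the syntax:

```
# Rules for all crawlers
User-agent: *
# Ask crawlers not to request anything under /private/
Disallow: /private/
# Explicitly permit the rest of the site
Allow: /

# Rules for one specific crawler override the general block above
User-agent: Googlebot
Disallow: /drafts/
```

Each `User-agent` group applies to the named crawler, and a crawler uses the most specific group that matches it. Note that robots.txt is a request, not an enforcement mechanism: well-behaved bots honor it, but it does not protect private content.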
Want to know more about the key concepts of Search Engine Optimization (SEO)? Click here.
What is robots.txt used for?
A robots.txt file is used primarily to manage crawler traffic to your site, for example to keep crawlers from overloading it with requests or from wasting crawl budget on unimportant pages.
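For instance, a site might stop crawlers from requesting endless internal search-result URLs, which add crawl load without adding indexable value. A sketch, with made-up paths:

```
User-agent: *
# Internal search results generate unbounded URL variations;
# blocking them keeps crawler traffic focused on real content
Disallow: /search
Disallow: /*?sort=
```

The `Disallow: /*?sort=` pattern uses the `*` wildcard, which major crawlers such as Googlebot support, though wildcard handling is not guaranteed by every bot.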