What is robots.txt?

Robots.txt is a plain text file that gives instructions to search engine crawlers about which webpages, directories or files of a website they may crawl. It is generally used to tell spiders about the pages that you do not want crawled. Following these instructions is not mandatory, yet reputable search engine spiders obey the rules in robots.txt. A minimal example is shown below.
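For illustration, here is a minimal robots.txt sketch. The User-agent, Disallow and Allow directives are the standard ones; the /admin/ and /tmp/ paths are hypothetical examples:

    # Rules below apply to all crawlers
    User-agent: *
    # Do not crawl these directories (hypothetical paths)
    Disallow: /admin/
    Disallow: /tmp/
    # Everything else may be crawled
    Allow: /

Each User-agent line names the crawler a block of rules applies to; an asterisk matches every crawler.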

The location of this file is very significant: it must be placed in the root directory of the site, otherwise spiders will not find it, because they do not search the whole site for a file named robots.txt. They request it only from the root directory, and if it is not there, they assume the site has no robots.txt file and crawl the whole site.
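For example, for a site at https://www.example.com (a placeholder domain), crawlers will only request https://www.example.com/robots.txt; a copy placed at https://www.example.com/pages/robots.txt would never be checked.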