What is Robots.txt?

Robots.txt is a text file with instructions for web robots (typically search engine robots) about which areas of a website they are allowed to crawl. Keep in mind that this file does not protect against unauthorized access, since compliance with it is voluntary; use our free plugin Stop Bad Bots to block bad robots. These crawl instructions are specified by “disallowing” or “allowing” access for certain (or all) user agents.
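For instance, the following rules block a crawler identifying itself as “BadBot” from the entire site while leaving everything open to all other user agents (“BadBot” is just a placeholder name for illustration):

User-agent: BadBot
Disallow: /

User-agent: *
Disallow: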

You can also integrate a link to your sitemap, which gives search engine crawlers an overview of all existing URLs of your site.

The robots.txt file must be stored in the root directory of your domain, so crawlers can find it at a predictable address (for example, https://www.mysite.com/robots.txt).

For example, the robots.txt file for the website https://www.mysite.com/ could look like this:

User-agent: *
Disallow: /login/
Sitemap: https://www.mysite.com/sitemap.xml

In the example above, you are telling all search engines not to crawl your /login/ folder and where to find your sitemap.xml file.
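
If you want to verify how a crawler would interpret these rules, Python’s standard library includes a robots.txt parser. Below is a minimal sketch, assuming the example domain above is reachable; the user agent string “*” stands for any robot:

from urllib.robotparser import RobotFileParser

# Point the parser at the site's robots.txt and download it.
rp = RobotFileParser()
rp.set_url("https://www.mysite.com/robots.txt")
rp.read()

# Check two URLs against the rules for any user agent ("*").
print(rp.can_fetch("*", "https://www.mysite.com/login/"))  # False: /login/ is disallowed
print(rp.can_fetch("*", "https://www.mysite.com/blog/"))   # True: everything else is allowed

This is the same kind of check that well-behaved crawlers perform before fetching a page.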