TL;DR: Robots.txt is a file websites use to communicate with web crawlers and other web robots, indicating which parts of the site should not be crawled. It helps control bot access to specified areas of a website.

Robots.txt is a plain text file that implements the Robots Exclusion Protocol (REP), a standard websites use to communicate with crawlers. The file tells search engine bots such as Googlebot which pages or directories they may crawl. By disallowing low-value or resource-heavy paths, robots.txt helps a site avoid being bombarded with too many requests and lets crawlers spend their crawl budget on the pages that matter for SEO.
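To make the crawling rules concrete, here is a sketch using Python's standard-library urllib.robotparser to evaluate a small illustrative rule set; the domain and paths are placeholders, not taken from any real site:

```python
import urllib.robotparser

# Illustrative robots.txt content (placeholder paths, not a real site).
# The more specific Allow rule is listed first because urllib.robotparser
# applies the first matching rule, not the longest match.
rules = """
User-agent: *
Allow: /admin/public/
Disallow: /admin/
""".splitlines()

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules)

# A well-behaved crawler checks each URL against the rules before fetching it
print(rp.can_fetch("*", "https://example.com/admin/secret"))       # False
print(rp.can_fetch("*", "https://example.com/admin/public/page"))  # True
print(rp.can_fetch("*", "https://example.com/blog/post"))          # True
```

Paths with no matching rule default to allowed, which is why the last check returns True.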

Updated February 17, 2024

Axel Grubba is the founder of Findstack, a B2B software comparison platform. His background spans management consulting and venture capital, where he invested in software. Recently, Axel has developed a passion for coding, and he enjoys traveling when he is not building and improving Findstack.