Robots.txt Generator
Generate robots.txt files to control how search engines crawl your website. Block specific directories, allow certain bots, and set crawl rules.
How to Use
- Select user agents to configure
- Add allow/disallow rules
- Set crawl delay if needed
- Add your sitemap URL (a sample of the generated file is shown below)
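For example, a generated file for a typical site might look like the sketch below. The domain and paths are placeholders, not defaults produced by the tool:

```
# Rules for all crawlers
User-agent: *
Disallow: /admin/
Disallow: /tmp/
# Re-open one subfolder inside the blocked /admin/ path
Allow: /admin/public-docs/
# Optional politeness hint (ignored by Googlebot)
Crawl-delay: 10

# Point crawlers at the sitemap (absolute URL)
Sitemap: https://example.com/sitemap.xml
```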
About robots.txt
The robots.txt file tells search engine crawlers which pages or sections of your site they can or cannot access. It must be placed in your website's root directory, because crawlers only request it at /robots.txt; a copy anywhere else is ignored.
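As a minimal illustration (the domain is a placeholder), a file served from the root that allows all crawlers everywhere looks like this:

```
# Served at https://example.com/robots.txt
# An empty Disallow value means nothing is blocked
User-agent: *
Disallow:
```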
Use "Disallow" to block crawlers from specific paths like /admin/, /private/, or /tmp/. Use "Allow" to permit access to specific files within blocked directories.
The Crawl-delay directive asks bots to wait a set number of seconds between requests, which can help reduce server load. Googlebot ignores this directive, though some crawlers such as Bingbot respect it.
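For crawlers that honor it, the delay is set per user-agent group; the bot name and value here are illustrative:

```
# Ask Bingbot to wait 10 seconds between requests
User-agent: Bingbot
Crawl-delay: 10
```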
Always reference your sitemap in robots.txt using the Sitemap directive with its full absolute URL. This helps search engines discover your sitemap without needing to submit it manually.
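The Sitemap line stands on its own and is not tied to any User-agent group; the URL below is a placeholder:

```
# Can appear anywhere in the file; multiple Sitemap lines are allowed
Sitemap: https://example.com/sitemap.xml
```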