Robots.txt Generator
Easily create or edit a robots.txt file to guide search engine crawlers on your website.
Quick Templates
General Settings
Enter your sitemap URL to help search engines find your content
Comma-separated list of directories to block from all bots
Search Engines
AI Training Bots
Note: These bots collect data to train AI models. Blocking them prevents your content from being used in AI training datasets.
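For example, a robots.txt section that blocks a few widely known AI training crawlers might look like the sketch below (the user-agent names shown are the commonly documented ones at the time of writing; check each vendor's documentation for the canonical string):

```
# Block OpenAI's training crawler
User-agent: GPTBot
Disallow: /

# Block Common Crawl (data often used for AI training)
User-agent: CCBot
Disallow: /

# Opt out of Google's AI training uses
User-agent: Google-Extended
Disallow: /
```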
Explicitly allow these directories (takes precedence over Disallow)
One bot name per line. These will be blocked from accessing your entire site.
Legacy & Archive Bots
Generated robots.txt
How to Use Your Robots.txt
- Download the file using the button above
- Upload to your website's root directory (e.g., https://example.com/robots.txt)
- Test it using the robots.txt report in Google Search Console (which replaced the standalone robots.txt Tester)
- Submit your sitemap to search engines via Google Search Console
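You can also sanity-check your rules locally before uploading. The sketch below uses Python's standard-library `urllib.robotparser` with an inline example file (the paths are placeholders); note that this parser applies rules in order of appearance, so the more specific `Allow` line is listed before the broader `Disallow`:

```python
from urllib import robotparser

# A sample robots.txt body; in practice you would call
# set_url("https://example.com/robots.txt") and read() instead.
rules = """
User-agent: *
Allow: /admin/public/
Disallow: /admin/

Sitemap: https://example.com/sitemap.xml
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("*", "https://example.com/admin/secret.html"))     # → False
print(rp.can_fetch("*", "https://example.com/admin/public/faq.html")) # → True
print(rp.can_fetch("*", "https://example.com/blog/post.html"))        # → True
```

Paths with no matching rule default to allowed, which is why the blog URL passes.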
What is Robots.txt?
The robots.txt file is a text file webmasters create to instruct web robots (like search engine crawlers) how to crawl pages on their website. It's part of the Robots Exclusion Protocol (REP).
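A minimal robots.txt showing the core REP directives (the directory names and sitemap URL below are placeholders):

```
# Rules for all crawlers
User-agent: *
Disallow: /admin/
Disallow: /tmp/

# Point crawlers at your sitemap
Sitemap: https://example.com/sitemap.xml
```

Rules are grouped under a `User-agent` line; the `Sitemap` directive stands on its own and applies regardless of user agent.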
Common Uses
- Block access to private/admin areas
- Prevent duplicate content issues
- Control crawl budget for large sites
- Block AI training bots
Important Notes
- Not a security mechanism (blocked files remain publicly accessible)
- Some bots may ignore robots.txt
- Must be located at the site root
- Changes may take time to propagate