Robots.txt Generator
Create a perfectly formatted robots.txt file for your website. Choose a preset or build custom rules for each bot.
User-agent: *
Disallow: /admin
Disallow: /private
Disallow: /api

User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: Bytespider
Disallow: /
No sitemap URL specified. Adding a sitemap helps search engines discover your pages.
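A sitemap is declared with a single Sitemap line, which can appear anywhere in the file and takes a full URL. A minimal sketch (the URL and paths below are placeholders, not defaults the generator produces):

```
Sitemap: https://example.com/sitemap.xml

User-agent: *
Disallow: /admin
```

Unlike Disallow rules, the Sitemap directive is independent of any User-agent group, so one line covers all crawlers.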
How It Works
Choose a preset
Start with a recommended preset or go fully custom to define rules for each bot.
Add user-agent rules
Specify which paths to allow or disallow for each crawler, including AI bots.
Copy or download
Copy your robots.txt to clipboard or download it, then upload to your site root.
Why Use a Robots.txt File?
A robots.txt file tells search engine crawlers which pages or sections of your site they can or cannot access. It is essential for controlling how bots interact with your website.
With the rise of AI crawlers like GPTBot and ClaudeBot, having a well-configured robots.txt is more important than ever. You can selectively block AI training bots while keeping your site indexed by search engines.
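Before uploading, you can sanity-check that your rules behave as intended. Python's standard-library urllib.robotparser evaluates a robots.txt the same way a well-behaved crawler would; a small sketch (the rules and URLs below are illustrative examples, not output from this tool):

```python
from urllib import robotparser

# A robots.txt that blocks AI training bots site-wide but only keeps
# search engines out of /admin (paths here are illustrative).
RULES = """\
User-agent: *
Disallow: /admin

User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /
"""

parser = robotparser.RobotFileParser()
parser.parse(RULES.splitlines())

# Search engines can still crawl public pages...
print(parser.can_fetch("Googlebot", "https://example.com/blog/post"))  # True
# ...but AI training bots are blocked everywhere.
print(parser.can_fetch("GPTBot", "https://example.com/blog/post"))     # False
```

Because GPTBot matches its own User-agent group, the `Disallow: /` rule applies to every path, while Googlebot falls back to the `*` group and is only excluded from /admin.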
Related Free SEO Tools
Robots.txt AI Bot Checker
Check which AI crawlers are blocked or allowed by your existing robots.txt file.
Noindex Checker
Verify which pages on your site have noindex tags preventing search engine indexing.
XML Sitemap Analyzer
Analyze your XML sitemap for URL count, lastmod dates, and potential issues.