A simple tool to generate an optimized robots.txt file that properly configures access for AI search engine crawlers like ChatGPT (GPTBot), Perplexity (PerplexityBot), Google Gemini, Claude, and more.
Prefer a web-based tool? Try our free online generators – no installation needed:
- AI Robots.txt Generator – Generate an optimized robots.txt for AI crawlers right in your browser
- AI Crawler Access Checker – Check if AI search engines can access your site
- Full GEO Scan – Comprehensive 11-signal AI search readiness scan
Many websites accidentally block AI search engine crawlers, making their content invisible to ChatGPT, Perplexity, and other AI-powered search engines. This tool helps you generate a properly configured robots.txt that:
- Allows all major AI crawlers to access your content
- Maintains your existing SEO crawler settings
- Follows best practices for Generative Engine Optimization (GEO)
Check your current robots.txt configuration: GEOScore Robots.txt Checker
```bash
# Clone this repo
git clone https://github.com/henu-wang/ai-robots-txt-generator.git
cd ai-robots-txt-generator

# Generate robots.txt
python generate.py --domain yourdomain.com --output robots.txt
```

Copy the template below and customize it for your site:
```
# ============================================
# AI Search Engine Crawlers Configuration
# Generated by: https://github.com/henu-wang/ai-robots-txt-generator
# Guide: https://geoscoreai.com/blog/robots-txt-ai-crawlers
# ============================================

# OpenAI - ChatGPT & SearchGPT
User-agent: GPTBot
Allow: /

User-agent: ChatGPT-User
Allow: /

User-agent: OAI-SearchBot
Allow: /

# Perplexity AI
User-agent: PerplexityBot
Allow: /

# Google - Gemini & AI Overviews
User-agent: Google-Extended
Allow: /

# Anthropic - Claude
User-agent: ClaudeBot
Allow: /

# Apple - Apple Intelligence & Siri
User-agent: Applebot-Extended
Allow: /

# Meta AI
User-agent: Meta-ExternalAgent
Allow: /

# Cohere AI
User-agent: cohere-ai
Allow: /

# ByteDance - Doubao
User-agent: Bytespider
Allow: /

# ============================================
# Standard Search Engine Crawlers
# ============================================

User-agent: Googlebot
Allow: /

User-agent: Bingbot
Allow: /

User-agent: *
Allow: /

# ============================================
# Sitemap
# ============================================

Sitemap: https://yourdomain.com/sitemap.xml
```
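You can sanity-check rules like these locally with Python's standard-library robots.txt parser. Below is a minimal sketch: the abbreviated rule set and `example.com` URLs are illustrative, not the full template above.

```python
from urllib.robotparser import RobotFileParser

# Abbreviated robots.txt for illustration: one allowed AI crawler,
# one blocked crawler, and a permissive wildcard fallback.
robots_txt = """\
User-agent: GPTBot
Allow: /

User-agent: Bytespider
Disallow: /

User-agent: *
Allow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("GPTBot", "https://example.com/blog/post"))      # True
print(rp.can_fetch("Bytespider", "https://example.com/blog/post"))  # False
```

The same check can be pointed at a deployed file by calling `rp.set_url("https://yourdomain.com/robots.txt")` followed by `rp.read()` instead of `rp.parse(...)`.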
| Crawler | Operator | Product | Documentation |
|---|---|---|---|
| GPTBot | OpenAI | ChatGPT | docs |
| ChatGPT-User | OpenAI | ChatGPT Browse | docs |
| OAI-SearchBot | OpenAI | ChatGPT Search | docs |
| PerplexityBot | Perplexity | Perplexity AI | docs |
| Google-Extended | Google | Gemini/AI Overviews | docs |
| ClaudeBot | Anthropic | Claude | docs |
| Applebot-Extended | Apple | Apple Intelligence | docs |
| Meta-ExternalAgent | Meta | Meta AI | docs |
| cohere-ai | Cohere | Cohere AI | - |
| Bytespider | ByteDance | Doubao/TikTok | - |
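To see which of these crawlers are actually visiting your site, you can match their user-agent tokens against your server logs. A hypothetical helper (the token list mirrors the table above; the function name is ours, not part of this repo):

```python
# Token list mirrors the crawler table above.
AI_CRAWLER_TOKENS = [
    "GPTBot", "ChatGPT-User", "OAI-SearchBot", "PerplexityBot",
    "Google-Extended", "ClaudeBot", "Applebot-Extended",
    "Meta-ExternalAgent", "cohere-ai", "Bytespider",
]

def identify_ai_crawler(user_agent):
    """Return the matching crawler token, or None for ordinary traffic."""
    ua = user_agent.lower()
    for token in AI_CRAWLER_TOKENS:
        if token.lower() in ua:
            return token
    return None

# Example log line from a ChatGPT crawl:
print(identify_ai_crawler(
    "Mozilla/5.0; compatible; GPTBot/1.2; +https://openai.com/gptbot"))
# → GPTBot
```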
```bash
python generate.py --help
```

```bash
# Generate with default settings (allow all AI crawlers)
python generate.py --domain example.com

# Output to file
python generate.py --domain example.com --output robots.txt

# Block specific crawlers
python generate.py --domain example.com --block GPTBot,Bytespider

# Allow only specific crawlers
python generate.py --domain example.com --allow-only GPTBot,PerplexityBot,Google-Extended
```

```bash
# Add custom disallow paths
python generate.py --domain example.com --disallow "/admin,/api,/private"

# Include crawl-delay
python generate.py --domain example.com --crawl-delay 10

# Multiple sitemaps
python generate.py --domain example.com --sitemap "https://example.com/sitemap.xml,https://example.com/blog-sitemap.xml"
```

After deploying your robots.txt, verify it works:
- Free scan: GEOScore – Scans your entire site for AI search readiness
- Robots.txt check: GEOScore Robots.txt Checker – Validates AI crawler access
- AI crawl check: GEOScore AI Crawl Checker – Tests if AI bots can reach your content
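A generator with the CLI flags shown earlier (`--block`, `--disallow`, `--crawl-delay`, `--sitemap`) could assemble the file along these lines. This is a hypothetical sketch of the approach, not the actual generate.py implementation:

```python
# Hypothetical sketch; the crawler list mirrors this repo's template.
AI_CRAWLERS = [
    "GPTBot", "ChatGPT-User", "OAI-SearchBot", "PerplexityBot",
    "Google-Extended", "ClaudeBot", "Applebot-Extended",
    "Meta-ExternalAgent", "cohere-ai", "Bytespider",
]

def build_robots_txt(domain, block=(), disallow=(), crawl_delay=None, sitemaps=None):
    """Assemble a robots.txt string: blocked bots get Disallow: /,
    all other bots get Allow: / plus any custom Disallow paths."""
    lines = []
    for bot in AI_CRAWLERS:
        lines.append(f"User-agent: {bot}")
        if bot in block:
            lines.append("Disallow: /")
        else:
            lines.append("Allow: /")
            lines.extend(f"Disallow: {path}" for path in disallow)
            if crawl_delay is not None:
                lines.append(f"Crawl-delay: {crawl_delay}")
        lines.append("")  # blank line between groups
    lines += ["User-agent: *", "Allow: /", ""]
    for sm in sitemaps or [f"https://{domain}/sitemap.xml"]:
        lines.append(f"Sitemap: {sm}")
    return "\n".join(lines) + "\n"

print(build_robots_txt("example.com", block=["Bytespider"]))
```

Emitting `Disallow: /` for blocked bots, rather than omitting them, makes the opt-out explicit and easy to audit in the generated file.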
- Robots.txt for AI Crawlers Guide – Complete guide to configuring robots.txt for AI
- What is GEO? – Introduction to Generative Engine Optimization
- Awesome GEO – Curated list of GEO resources
- GEO Scoring Methodology – Open scoring methodology
PRs welcome! If you know of new AI crawlers or have suggestions for the generator, please open an issue or submit a PR.
- GEOScore AI Scanner – Check your website's AI search visibility across 11 signals
- AI Robots.txt Generator – Generate optimized robots.txt for AI crawlers
- AI Crawler Access Checker – Verify which AI bots can access your site
- awesome-geo – Curated list of GEO resources, tools, and guides
- geo-scoring-methodology – Open methodology for scoring AI search readiness
- geo-checklist – Interactive pre-launch GEO readiness checklist
- ai-crawlers-reference – Complete database of AI search engine crawler user-agents
- geo-badge-generator – Generate badges showing your GEO readiness score
- llms-txt-examples – Real-world llms.txt implementation examples by industry
- geo-config-examples – Ready-to-use AI search optimization configs for popular frameworks
- geo-case-studies – Real-world GEO optimization case studies with before/after data
- ai-search-readiness-framework – 11-signal AI search readiness evaluation framework
MIT License