Robots.txt Generator

Generate a robots.txt file with custom rules, presets, and AI bot blocking.


Tips

• User-agent: * applies to all crawlers

• Disallow: / blocks the entire site

• Allow: / explicitly permits all paths

• Place this file at the root: example.com/robots.txt
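A minimal file combining the directives above might look like this (the /private/ path is a placeholder for illustration):

```text
# Applies to all crawlers
User-agent: *
# Block one section of the site
Disallow: /private/
# Explicitly permit everything else
Allow: /
```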

About Robots.txt Generator

Generate a valid robots.txt file with a visual rule editor. Configure per-bot allow/disallow rules, crawl delays, sitemaps, and use presets to allow all, block all, or specifically block AI crawlers.

How to Use

  1. Choose a preset (Allow All, Block All, Block AI bots) or start with Custom.
  2. Add rules for specific bots using the User-Agent selector or type a custom name.
  3. Add Allow and Disallow paths, or click common paths to add them quickly.
  4. Optionally set a Crawl-delay and add your Sitemap URL.
  5. Copy or download the generated robots.txt file.
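Following the steps above, the generated file might look like this (the disallowed path and sitemap URL are placeholders):

```text
User-agent: *
Disallow: /admin/
Allow: /
# Ask compliant crawlers to wait 10 seconds between requests
Crawl-delay: 10

Sitemap: https://example.com/sitemap.xml
```

Note that Crawl-delay is a non-standard directive: some crawlers honor it, while Googlebot ignores it.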

Frequently Asked Questions

What is robots.txt?
robots.txt is a plain text file placed at the root of your website that tells web crawlers which pages or sections they can or cannot crawl. It is part of the Robots Exclusion Protocol.
Does robots.txt stop all bots?
No. robots.txt is a voluntary standard — well-behaved bots like Googlebot follow it, but malicious scrapers may ignore it entirely. For sensitive content, use authentication instead.
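To see how a compliant crawler interprets your rules, you can test them with Python's standard-library parser. A quick sketch, using hypothetical rules for illustration:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules for illustration
rules = """
User-agent: *
Disallow: /admin/
Allow: /
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# A well-behaved bot checks can_fetch() before requesting a URL
print(parser.can_fetch("*", "https://example.com/admin/settings"))  # False
print(parser.can_fetch("*", "https://example.com/blog/post"))       # True
```

This is exactly the check that compliant crawlers perform; a malicious scraper simply skips it, which is why robots.txt is access *guidance*, not access *control*.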
How do I block AI crawlers?
Use the "Block AI bots" preset to add Disallow: / rules for GPTBot (OpenAI), ChatGPT-User, ClaudeBot, and CCBot, while keeping the site crawlable for SEO bots.
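The preset's output looks roughly like this (exact ordering and any additional bot names may vary):

```text
User-agent: GPTBot
Disallow: /

User-agent: ChatGPT-User
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: CCBot
Disallow: /

# All other crawlers (including SEO bots) remain allowed
User-agent: *
Allow: /
```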
Where do I put robots.txt?
Place it at the root of your domain: https://example.com/robots.txt — it must be accessible from that exact URL for crawlers to find it.
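Crawlers derive the robots.txt location from the scheme and host alone, discarding any path or query. A small sketch of that derivation (the function name is our own, not a standard API):

```python
from urllib.parse import urlsplit, urlunsplit

def robots_txt_url(page_url: str) -> str:
    """Return the root-level robots.txt URL for the site hosting page_url."""
    parts = urlsplit(page_url)
    # Keep scheme + host, replace the path with /robots.txt, drop query/fragment
    return urlunsplit((parts.scheme, parts.netloc, "/robots.txt", "", ""))

print(robots_txt_url("https://example.com/blog/post?id=1"))
# https://example.com/robots.txt
```

This also means each subdomain needs its own file: rules at example.com/robots.txt do not apply to blog.example.com.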