FileConverterPro

Robots.txt

Generate a robots.txt file online for free. Set allow/disallow rules, sitemap URL, and crawl delay. Download or copy. No signup.

Runs entirely in your browser — files never uploaded

Preview

User-agent: *
Disallow:

How to use Robots.txt

  1. Add user-agent rules. Enter the bot name (or * for all bots) and specify which paths to allow or disallow.
  2. Set global options. Add a sitemap URL and optional crawl delay that apply to every rule.
  3. Use presets for speed. Click "Allow all", "Block all", or "Block AI bots" to start from a template.
  4. Copy or download. Preview the result, then copy to clipboard or download as a robots.txt file to upload to your site root.
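As an illustration, the steps above might produce a file like this (the paths and sitemap URL are placeholders, not output from the tool):

```
# Rule for all bots
User-agent: *
Disallow: /admin/
Allow: /admin/public/
Crawl-delay: 10

# Global option
Sitemap: https://yoursite.com/sitemap.xml
```

Note that Crawl-delay sits inside a user-agent group, while Sitemap is a standalone directive that applies to the whole file.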

FAQ

Where do I put robots.txt?

Upload it to the root of your website so it's accessible at https://yoursite.com/robots.txt. Most web hosts let you drop it in the public or root folder.

Does robots.txt block pages from appearing in Google?

It tells crawlers not to fetch the page, but Google may still index the URL (without content) if other sites link to it. Use a noindex meta tag for stronger removal.
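A noindex directive looks like this (it goes in the HTML of the page itself, not in robots.txt):

```html
<!-- Place inside the page's <head>. The page must NOT be
     disallowed in robots.txt, or crawlers never fetch the
     page and never see this tag. -->
<meta name="robots" content="noindex">
```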

What is Crawl-delay?

It requests that crawlers wait a set number of seconds between requests. Google ignores it, but Bing and some others respect it.
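If you want to sanity-check a crawl-delay rule, Python's standard-library robots.txt parser can read it. A quick sketch (the rules below are a made-up example, not output from this tool):

```python
from urllib.robotparser import RobotFileParser

# Example robots.txt content, parsed directly without a network fetch
rules = """User-agent: bingbot
Crawl-delay: 10
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.crawl_delay("bingbot"))              # 10
print(rp.can_fetch("bingbot", "/private/x"))  # False
print(rp.can_fetch("bingbot", "/blog/post"))  # True
```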

Which AI bots does the "Block AI bots" preset cover?

The preset adds disallow rules for GPTBot (OpenAI), ChatGPT-User (OpenAI's user-triggered agent), CCBot (Common Crawl, whose data feeds many AI training sets), Google-Extended (Google's Gemini training crawler), and anthropic-ai (Anthropic).
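The generated rules look roughly like this (a sketch; the preset's exact output may differ):

```
User-agent: GPTBot
Disallow: /

User-agent: ChatGPT-User
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: anthropic-ai
Disallow: /
```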
