Free Robots.txt Tester

Test your robots.txt rules with one input. Paste a domain or full URL and instantly see if the page is crawlable.

Googlebot simulation · Crawl-rule diagnostics · SEO + AI crawler friendly

We automatically detect the domain and path, then test with Googlebot.
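A check like this can be sketched with Python's standard-library robots.txt parser. This is an illustrative approximation, not the tool's actual implementation, and the rules below are hypothetical:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules; a real check would fetch https://<domain>/robots.txt
rules = [
    "User-agent: Googlebot",
    "Disallow: /private/",
    "Allow: /",
]

rp = RobotFileParser()
rp.parse(rules)

# Ask whether Googlebot may crawl a given URL
print(rp.can_fetch("Googlebot", "https://example.com/blog/post"))   # True
print(rp.can_fetch("Googlebot", "https://example.com/private/x"))   # False
```

`can_fetch` matches the URL's path against the rule group for the given user agent, which mirrors what a crawlability test does.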

What is a Robots.txt File?

A robots.txt file tells search engine crawlers which pages or files the crawler can or can't request from your site. It's used mainly to avoid overloading your site with requests.

Robots.txt Best Practices

  • Place the file in the root directory (e.g., example.com/robots.txt).
  • Include a link to your XML sitemap.
  • Don't use robots.txt to hide pages from Google Search results: a blocked page can still be indexed if other sites link to it. Use a noindex meta tag instead.
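Putting those practices together, a minimal robots.txt might look like this (the paths and sitemap URL are placeholders):

```
User-agent: *
Disallow: /admin/
Allow: /

Sitemap: https://example.com/sitemap.xml
```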

Robots.txt and AI Crawlers

Many AI crawlers respect robots.txt directives. Keep your rules explicit and avoid non-standard patterns so both search engines and LLM crawlers can interpret your policy correctly.
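For example, a site that wants to opt out of AI training crawls can add explicit groups for known AI user agents. The tokens below are the publicly documented names for OpenAI's and Common Crawl's bots; verify current names in each vendor's documentation before relying on them:

```
# Block OpenAI's training crawler
User-agent: GPTBot
Disallow: /

# Block Common Crawl's bot
User-agent: CCBot
Disallow: /
```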