Find out which AI crawlers your robots.txt is blocking. Inspect every User-agent group, see matched rules with line numbers, and test whether a specific URL is allowed or blocked.
We fetch /robots.txt directly from the domain.
Paste any URL from a site (even the homepage) and we will fetch its robots.txt automatically. Or paste your robots.txt content directly.
See which AI crawlers (GPTBot, ClaudeBot, PerplexityBot, Google-Extended, CCBot, and more) are blocked on your site, along with the matched rule and its line number.
Drop in any URL on the site to see if it is allowed or blocked, which User-agent group was matched, and exactly which Allow/Disallow line applied.
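The same allow/block check can be sketched with Python's standard-library `urllib.robotparser`. This is a minimal illustration, not this tool's implementation: the robots.txt content and crawler names below are example assumptions, and the stdlib parser reports only the allow/block verdict, not the matched line number.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content for illustration: blocks GPTBot
# site-wide while leaving all other crawlers allowed.
robots_txt = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

url = "https://example.com/blog/post"  # any URL on the site

# GPTBot falls into its own User-agent group and is disallowed;
# ClaudeBot falls through to the wildcard (*) group and is allowed.
for bot in ("GPTBot", "ClaudeBot", "PerplexityBot", "CCBot"):
    verdict = "allowed" if parser.can_fetch(bot, url) else "blocked"
    print(f"{bot}: {verdict}")
```

Note that `can_fetch()` matches the most specific `User-agent` group first, which mirrors how a crawler picks its group, but surfacing the exact matched line requires parsing the file yourself.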
Robots.txt is just the first signal. Get a free technical & AI-visibility audit and make sure the right bots can (or cannot) access your content.