How the checker works
Paste your robots.txt file and we parse every User-agent: group, then check each of the 16 major AI crawlers against the rules. The verdict for each crawler is one of:
- allowed - the crawler can index everything
- partial - the crawler is allowed in, but some paths are blocked
- blocked - the crawler is fully locked out
- not-mentioned - the crawler has no explicit group and falls back to the wildcard (User-agent: *) group

Not-mentioned is usually fine - but if your wildcard group is Disallow: /, your AI visibility is zero.
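The verdict logic above can be sketched in a few lines of Python. This is a simplified model, not the checker's actual implementation: it ignores path wildcards and longest-match precedence, and treats any non-empty Disallow as "partial".

```python
# Sketch of the verdict logic, assuming a simplified robots.txt grammar.
# Real matching (path wildcards, longest-match precedence) is more involved.

def parse_groups(robots_txt: str) -> dict[str, list[tuple[str, str]]]:
    """Map each user-agent token to its list of (field, path) rules."""
    groups: dict[str, list[tuple[str, str]]] = {}
    agents: list[str] = []  # agents sharing the current rule block
    in_rules = False
    for raw in robots_txt.splitlines():
        line = raw.split("#", 1)[0].strip()  # drop comments and whitespace
        if ":" not in line:
            continue
        field, _, value = line.partition(":")
        field, value = field.strip().lower(), value.strip()
        if field == "user-agent":
            if in_rules:  # a new group starts after a rule block
                agents, in_rules = [], False
            agents.append(value.lower())
            groups.setdefault(value.lower(), [])
        elif field in ("allow", "disallow"):
            in_rules = True
            for agent in agents:
                groups[agent].append((field, value))
    return groups

def verdict(groups: dict[str, list[tuple[str, str]]], bot: str) -> str:
    """Classify one crawler: allowed / partial / blocked / not-mentioned."""
    rules = groups.get(bot.lower())
    if rules is None:
        return "not-mentioned"  # falls back to the User-agent: * group
    disallows = [path for field, path in rules if field == "disallow" and path]
    allows = [path for field, path in rules if field == "allow"]
    if "/" in disallows and not allows:
        return "blocked"   # everything is off limits
    if disallows:
        return "partial"   # some paths are blocked
    return "allowed"
```

For example, given a file with `User-agent: GPTBot` / `Disallow: /` followed by a wildcard group, `verdict(groups, "GPTBot")` returns `"blocked"` while a crawler with no group of its own returns `"not-mentioned"`.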
How to fix it
- If a critical bot is blocked, add an explicit group for it:

  User-agent: GPTBot
  Allow: /

- If the wildcard group blocks everything, your site is invisible to every assistant - fix the wildcard before worrying about per-bot rules.
- Be intentional about Bytespider and CCBot - many sites block these without losing AI visibility, because the major assistants have their own named crawlers anyway.
- Pair robots.txt with a well-formed llms.txt file for the prose context layer that robots.txt can't express.
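Before deploying a fix like the ones above, you can spot-check a candidate robots.txt with Python's standard-library urllib.robotparser. The bot tokens below are illustrative, not the checker's actual list - substitute the crawlers you care about.

```python
from urllib.robotparser import RobotFileParser

# Illustrative crawler tokens - swap in the bots that matter to you.
AI_BOTS = ["GPTBot", "ClaudeBot", "PerplexityBot", "Google-Extended"]

def spot_check(robots_txt: str, path: str = "/") -> dict[str, bool]:
    """Return, per bot, whether this robots.txt permits fetching the path."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return {bot: rp.can_fetch(bot, path) for bot in AI_BOTS}

fixed = """\
User-agent: GPTBot
Allow: /

User-agent: *
Disallow: /admin/
"""
print(spot_check(fixed))             # every bot can fetch the homepage
print(spot_check(fixed, "/admin/"))  # only GPTBot's explicit group allows it
```

Because GPTBot has its own group, it never consults the wildcard rules - which is exactly why an explicit group is the safest fix for a critical bot.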
Why this matters for AI search
If a crawler is blocked, the assistant that owns it can't index your content - which means you can't be cited, quoted, or summarized in that assistant's answers, no matter how good your content is. The full mechanism is covered in our guides do AI assistants follow links and why your site isn't in ChatGPT.