How it works
Walk through the checklist: check the boxes for any middleware you know sits in front of your origin, paste the HTTP status your llms.txt returns, and tell the tool whether robots.txt allows the path. The tool flags every blocker a crawler might hit on the way to your file, and each finding comes with a one-line remediation.
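The two signals the checklist asks for can be probed by hand. The sketch below, using only Python's standard library, makes one request at a time without following redirects, so you see the raw HTTP status and each hop of a redirect chain the way a minimal crawler would. The domain is a placeholder, and the helper names are ours, not part of any tool.

```python
import http.client
from urllib.parse import urljoin, urlsplit

def probe(url: str, timeout: float = 10.0):
    """Make ONE request and return (status, Location header or None).

    http.client never follows redirects, so each hop in a chain is
    visible individually -- which is what a minimal crawler sees.
    """
    parts = urlsplit(url)
    Conn = (http.client.HTTPSConnection if parts.scheme == "https"
            else http.client.HTTPConnection)
    conn = Conn(parts.netloc, timeout=timeout)
    try:
        conn.request("GET", parts.path or "/")
        resp = conn.getresponse()
        return resp.status, resp.getheader("Location")
    finally:
        conn.close()

def final_status(url: str, max_hops: int = 5):
    """Walk a redirect chain hop by hop; return (final status, hop count)."""
    hops = 0
    status, location = probe(url)
    while status in (301, 302, 307, 308) and location and hops < max_hops:
        hops += 1
        url = urljoin(url, location)  # Location may be relative
        status, location = probe(url)
    return status, hops

# Example with a placeholder domain:
# final_status("https://example.com/llms.txt")
```

Paste the first hop's status into the checklist; if `final_status` reports more than zero hops, you have a redirect chain to flatten.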
What blocks AI crawlers
- CDN bot challenges. Cloudflare Bot Fight Mode, Akamai Bot Manager, and similar services block anything flagged as automated, including the crawlers you actually want.
- Redirect chains. A single 301 from apex → www works for Google, but some AI crawlers stop at the first redirect and never reach the final URL.
- Auth walls. llms.txt behind a login page, HTTP basic auth, or any form of authentication is invisible to every crawler.
- robots.txt conflicts. Crawlers check robots.txt first, so a blanket Disallow there overrides anything your llms.txt says.
Pair with
Test the content with the LLMs.txt Tester, check hosting details with the Hosting Checker, and compare the result against your robots.txt. Strategy reading: do AI assistants follow links.