
LLM Infrastructure · Free · runs in your browser

Test llms.txt reachability and directives

LLMs.txt Tester

Fetch any live domain's llms.txt and test whether it's reachable and correctly formatted, and see which directives it sends to AI crawlers.

Browsers can't fetch third-party URLs directly (CORS). Paste the file contents below, then copy the curl command to verify live reachability from your terminal.

Verify live with curl

curl -sSL -A "GPTBot" -D - "https://acme.com/llms.txt"

Test results

9 pass · 0 warn · 0 fail

Domain-root URL pattern

pass

URL resolves to /llms.txt at the domain root - correct location

File has content

pass

453 characters parsed

Starts with H1 brand name

pass

First line is "# Acme"

Has summary blockquote

pass

Blockquote summary found - good for model intake

Declares at least one User-agent

pass

4 user-agent blocks: GPTBot, ClaudeBot, PerplexityBot, CCBot

Has Allow/Disallow directives

pass

3 Allow, 1 Disallow directives

References a Sitemap

pass

Sitemap directive found

Primary source list

pass

3 markdown link(s) to primary sources

File size under 100 KB

pass

0.4 KB - well within crawler limits

Directives seen by crawlers

GPTBot · ClaudeBot · PerplexityBot · CCBot
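For reference, here is a minimal llms.txt that would pass all of the checks above. The domain, summary text, and paths are illustrative, not a real Acme file:

```
# Acme

> Acme builds developer tools. Key docs, pricing, and changelog are linked below.

## Primary sources

- [Docs](https://acme.com/docs)
- [Pricing](https://acme.com/pricing)
- [Changelog](https://acme.com/changelog)

User-agent: GPTBot
User-agent: ClaudeBot
User-agent: PerplexityBot
User-agent: CCBot
Allow: /docs/
Allow: /pricing
Allow: /changelog
Disallow: /internal/

Sitemap: https://acme.com/sitemap.xml
```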

How it works

Because browsers can't fetch arbitrary third-party URLs (CORS), this tester runs two steps. First, paste your llms.txt content into the box below and the tool parses every directive, checks structure, counts user-agent blocks, and flags anything missing. Second, copy the generated curl command and run it from your terminal to confirm the file is actually reachable with the correct headers - the curl line impersonates GPTBot so you see exactly what OpenAI's crawler sees.
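The directive-parsing step can be sketched in a few lines. This is a simplified illustration of robots.txt-style block grouping, not the tool's exact parser: consecutive `User-agent` lines share the rules that follow, and a `User-agent` line after a rule starts a new block.

```python
def parse_llms_txt(text):
    """Extract user-agent blocks and their Allow/Disallow rules from
    llms.txt content (robots.txt-style directives inside markdown)."""
    blocks = {}                  # user-agent name -> list of (directive, path)
    current, seen_rule = [], False
    for raw in text.splitlines():
        key, sep, value = raw.strip().partition(":")
        if not sep:
            continue             # not a directive line (heading, blockquote, blank)
        key, value = key.strip().lower(), value.strip()
        if key == "user-agent":
            if seen_rule:        # a rule ended the previous block
                current, seen_rule = [], False
            current.append(value)
            blocks.setdefault(value, [])
        elif key in ("allow", "disallow") and current:
            seen_rule = True
            for agent in current:     # rules apply to every grouped agent
                blocks[agent].append((key, value))
    return blocks
```

Markdown links are skipped automatically because their `[text](https` prefix never matches a known directive key.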

What the tester checks

  • Correct location. llms.txt must sit at the domain root, not a subdirectory or CDN subpath.
  • Valid structure. H1 brand name, blockquote summary, primary sources, user-agent blocks - these are the pieces models actually consume.
  • Directive parity. Counts Allow/Disallow rules and names every user-agent mentioned so you can spot gaps (missing ClaudeBot, no Perplexity block).
  • Crawler-safe size. Some crawlers truncate files above 100 KB, so the tester flags bloat early.

Pair with

After testing, validate syntax with the LLMs.txt Validator, check hosting with the AI Bot Access Checker, and cross-reference your robots.txt against your llms.txt to make sure they don't contradict each other. Strategy reading: what is llms.txt.

Want this done for you?

Ship the full GEO playbook in 14 days

Geolify GEO packages bundle every tool on this site into a 14-day done-for-you build - llms.txt, schema, entity strength, content overhaul, citations and the measurement stack. From $499.

Explore More Packages

Combine services for maximum AI visibility.