How the validator works
Paste your llms.txt file into the text area and we check it line by line against the informal spec. The validator looks for the H1 site name, a one-line blockquote summary, section headings (e.g. ## Primary sources, ## Bot policy), markdown link structure, and known AI user agents. It also flags the most common mistake: accidentally using robots.txt syntax (User-agent:, Allow:, Disallow:) inside a markdown file.
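For orientation, here is a minimal llms.txt that would pass those checks. The site name, summary line, links, and bot names are all illustrative placeholders, not required values:

```markdown
# Example Site

> Example Site publishes plain-language guides to web standards.

## Primary sources

- [Guides index](https://example.com/guides): canonical versions of every guide

## Bot policy

GPTBot, ClaudeBot, and PerplexityBot may cite any page on this site.
```

Note the shape: one H1, one blockquote summary directly under it, `##` sections with markdown links, and no robots.txt directives anywhere.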
What to fix first
- Errors block parsing entirely - fix them first. The most common is leftover robots.txt directives.
- Warnings hurt citation quality - especially a missing summary blockquote, which assistants use as your canonical one-liner.
- Info notes are non-blocking but improve signal strength - listing bots by name gives assistants an explicit policy to follow.
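The three tiers above can be sketched as a rough triage pass. This is an illustrative simplification, not the validator's actual rule set, and the bot names checked are examples:

```python
import re

# Error tier: robots.txt directives have no place in a markdown file.
ROBOTS_DIRECTIVES = re.compile(r"^(User-agent|Allow|Disallow)\s*:", re.I | re.M)

def triage(text: str) -> dict:
    """Sort llms.txt problems into errors, warnings, and info notes."""
    issues = {"errors": [], "warnings": [], "info": []}
    # Errors block parsing entirely.
    if ROBOTS_DIRECTIVES.search(text):
        issues["errors"].append("robots.txt directives found; use markdown instead")
    # Warnings hurt citation quality: no blockquote means no canonical one-liner.
    if not re.search(r"^>\s+\S", text, re.M):
        issues["warnings"].append("missing one-line blockquote summary")
    # Info: naming bots explicitly gives assistants a policy to follow.
    if not re.search(r"\b(GPTBot|ClaudeBot|PerplexityBot)\b", text):
        issues["info"].append("no AI user agents named in the bot policy")
    return issues
```

A file with leftover `Disallow:` lines lands in the error bucket and gets fixed first; a clean markdown file with a blockquote and named bots comes back empty on all three tiers.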
Once your llms.txt passes
Upload it to the root of your site so it resolves at https://yourdomain.com/llms.txt, then run our AI Bot Access Checker against your robots.txt to make sure you're not accidentally blocking the same crawlers you just allowed. For the full background on llms.txt, see our what is llms.txt guide. Need a clean file from scratch? Use the LLMs.txt Generator.
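You can sanity-check that robots.txt step yourself with Python's standard-library robot parser. The robots.txt body, domain, and bot names below are placeholders; in practice you would fetch your real https://yourdomain.com/robots.txt:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt -- substitute your site's actual file.
robots_txt = """
User-agent: GPTBot
Disallow: /private/

User-agent: *
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Confirm the crawlers you allowed in llms.txt can actually reach it.
for bot in ("GPTBot", "ClaudeBot"):
    ok = parser.can_fetch(bot, "https://yourdomain.com/llms.txt")
    print(f"{bot}: {'allowed' if ok else 'BLOCKED'}")
```

If a bot you named in llms.txt prints BLOCKED here, your two files are contradicting each other, which is exactly what the AI Bot Access Checker looks for.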