Key Takeaway
An llms.txt file is a plain-text document deployed at yourdomain.com/llms.txt that directly tells AI language models what your website covers and which pages matter most — giving you explicit control over how ChatGPT, Claude, and Perplexity understand your site.
What Is llms.txt and Why Every Website Needs One
There’s a new file your website needs — and no, it isn’t another sitemap variant. It’s not robots.txt either. Most sites don’t have it yet, which means this is still a real first-mover advantage. It’s called llms.txt, and it’s fast becoming the standard way to tell AI engines exactly what your website is about.
If appearing in ChatGPT answers, Perplexity results, and Google AI Overviews matters to you, this file is one of the quickest wins available right now. Creating one takes less time than writing a blog post.
In This Article
- What Is llms.txt?
- Why Does llms.txt Matter for GEO?
- What Does an llms.txt File Look Like?
- A Real-World Example
What Is llms.txt?
llms.txt is a plain text file you place at the root of your website — yoursite.com/llms.txt — that gives language models a structured summary of what your site is, what it covers, and which pages deserve attention.
The closest analogy is a sitemap designed for AI rather than search crawlers. A traditional XML sitemap tells bots which URLs exist. llms.txt tells language models what those pages mean, who they’re for, and why any of it matters. That’s a fundamentally different type of signal.
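The sitemap comparison is practical as well as conceptual: a sitemap already lists your URLs, so it makes a convenient starting point for the Key Pages section. Below is a rough sketch that turns a sitemap's URL list into an llms.txt-style skeleton you then annotate by hand. The sample sitemap, domain, and generated page names are all placeholders, not taken from any real site.

```python
# Sketch: convert an XML sitemap's URLs into a "Key Pages" skeleton
# for llms.txt. The descriptions still need to be written by a human --
# that curation is the whole point of the format.
import xml.etree.ElementTree as ET
from urllib.parse import urlparse

SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://yoursite.com/</loc></url>
  <url><loc>https://yoursite.com/product</loc></url>
  <url><loc>https://yoursite.com/blog</loc></url>
</urlset>"""

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_to_key_pages(sitemap_xml: str) -> str:
    """Emit a Key Pages section with placeholder descriptions."""
    root = ET.fromstring(sitemap_xml)
    lines = ["## Key Pages"]
    for loc in root.findall(".//sm:loc", NS):
        url = loc.text.strip()
        path = urlparse(url).path.strip("/")
        name = path.replace("-", " ").title() if path else "Homepage"
        lines.append(f"- [{name}]({url}): TODO describe this page")
    return "\n".join(lines)

print(sitemap_to_key_pages(SITEMAP))
```

The TODO placeholders are deliberate: the sitemap supplies the URLs, but only you can supply the meaning.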
Jeremy Howard, co-founder of fast.ai, proposed the format, and it has been adopted quickly by AI-aware developers and founders. It isn't yet a formal standard, but it's already being read by major AI engines, and that's what counts.
Why Does llms.txt Matter for GEO?
When a language model crawls your site, it has to infer nearly everything. What does this brand actually do? Who are its customers? Which of these 200 pages is the most important one? The inference process is imperfect by design — content gets misclassified, key pages get overlooked, and your core value proposition gets diluted across thousands of words of HTML that the model has to interpret cold.
llms.txt short-circuits all of that. Instead of hoping the model figures it out, you give it a curated, authoritative summary written by you. The difference, in practice, is significant.
In the SearchScore GEO audit framework, llms.txt is scored in both the AI Citability category (worth 20 points) and the Topical Authority category (worth 15 points) — making it the highest-value technical fix available on a per-effort basis. Sites without one are leaving a lot of GEO score on the table for no good reason.
What Does an llms.txt File Look Like?
The format is deliberately simple. This is the basic structure:
# Brand Name
> A one or two sentence description of your site: what it does, who it's for.

## Key Pages
- [Page Name](https://yoursite.com/page): Brief description of what this page covers
- [Blog](https://yoursite.com/blog): Your content hub on [topic]
- [Product](https://yoursite.com/product): What your product does

## About
Brief paragraph about your organisation, expertise, and what makes your content credible.

## Topics Covered
- Topic 1
- Topic 2
- Topic 3
That really is all you need to get started. Some sites go further — team bios, content categories, data sources, editorial standards — but even the stripped-back version above is a meaningful improvement over having nothing. Start simple and build on it.
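If you maintain files like this for several sites or brands, it can help to generate the skeleton programmatically. Here's a minimal sketch that assembles the structure above from a few fields; the brand name, URLs, and descriptions are illustrative placeholders, not a recommended template engine.

```python
# Sketch: assemble a minimal llms.txt from a handful of fields.
# All values passed in below are placeholders for illustration.
def build_llms_txt(brand, description, pages, about, topics):
    """pages is a list of (name, url, description) tuples."""
    lines = [f"# {brand}", f"> {description}", "", "## Key Pages"]
    lines += [f"- [{name}]({url}): {desc}" for name, url, desc in pages]
    lines += ["", "## About", about, "", "## Topics Covered"]
    lines += [f"- {t}" for t in topics]
    return "\n".join(lines) + "\n"

text = build_llms_txt(
    brand="Brand Name",
    description="A one-sentence description of the site and its audience.",
    pages=[("Blog", "https://yoursite.com/blog", "Guides on the core topic")],
    about="Short paragraph on the organisation's expertise.",
    topics=["Topic 1", "Topic 2"],
)
print(text)
```

The output is exactly the stripped-back structure shown above, which you can then extend with team bios, data sources, or editorial standards as needed.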
A Real-World Example
Here’s a simplified version of the llms.txt used at SearchScore.io:
# SearchScore
> SearchScore analyses any website and gives it an AI Visibility Score out of 100, checking 56 signals across seven categories. Used by SEO professionals, marketers and founders to optimise for ChatGPT, Perplexity, Google AI Overviews and Claude.

## Key Pages
- [AI Visibility Score Tool](https://searchscore.io): Free GEO audit tool
- [Leaderboard](https://searchscore.io/leaderboard): Top-scoring websites by industry
- [Blog](https://searchscore.io/blog): GEO guides and AI search research

## Categories Audited
- AI Citability
- Brand Authority
- Content and E-E-A-T
- Technical Foundations
- Structured Data
- Platform Optimisation
- Topical Authority
Clear, specific, structured. An AI engine reading this knows what SearchScore does, what its key pages are, and what topics it covers — all in under 200 words. That’s the goal.
How to Create Your llms.txt in 20 Minutes
Step 1: Open a plain text editor
Not Word. Not Google Docs. Plain text. Notepad, VS Code, TextEdit in plain text mode — anything that won’t add invisible formatting.
Step 2: Write your header
# [Your Brand Name]
> [One or two sentences describing your site: what it does and who it's for.]
Step 3: List your key pages
Pick 5 to 10 pages that matter most. Your homepage, product pages, core service pages, and your strongest content.
## Key Pages
- [Homepage](https://yoursite.com): [Brief description]
- [Product Name](https://yoursite.com/product): [Brief description]
- [Blog](https://yoursite.com/blog): [Brief description]
Step 4: Add an about section
Two or three sentences about your organisation, your expertise, and what makes your content credible. This is where you give AI engines a reason to trust what you’ve written.
Step 5: Upload to your site root
Save as llms.txt (not llms.txt.txt — a surprisingly common mistake) and upload to your domain root. Test it by visiting yoursite.com/llms.txt in a browser. You should see plain text, nothing else.
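Beyond eyeballing it in a browser, you can check the response programmatically. The sketch below shows the checks that matter (HTTP 200, a text/plain content type, and a body that isn't HTML); the response values are hard-coded so it runs offline, but in practice you would feed it the headers and body from a real fetch of yoursite.com/llms.txt.

```python
# Sketch: sanity-check how a deployed llms.txt responds. The inputs
# below are hard-coded stand-ins for a real HTTP response.
def served_correctly(status: int, content_type: str, body: str) -> list[str]:
    """Return a list of problems; an empty list means it looks fine."""
    problems = []
    if status != 200:
        problems.append(f"expected HTTP 200, got {status}")
    if not content_type.startswith("text/plain"):
        problems.append(f"expected text/plain, got {content_type}")
    if body.lstrip().startswith("<"):
        problems.append("body looks like HTML, not plain text")
    return problems

# A correctly served file passes every check:
print(served_correctly(200, "text/plain; charset=utf-8", "# Brand Name\n"))
# A misconfigured server returning an HTML 404 page fails all three:
print(served_correctly(404, "text/html", "<html>Not found</html>"))
```

The HTML check catches the most common silent failure: a server that returns a styled 404 or redirect page with a 200-looking appearance in the browser.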
Step 6: Reference it in your existing files
Add a pointer in your robots.txt (this is an informal convention; crawlers that don't recognise the line simply ignore it):
# AI Engine Guidelines
LLMs-txt: https://yoursite.com/llms.txt
Common Mistakes to Avoid
- Saving as HTML instead of plain text. The file must be plain text. No HTML tags, no formatting, no exceptions.
- Putting it at /llms/ instead of /llms.txt. The path matters. Root level, correct extension.
- Being vague. “We help businesses grow” tells an AI engine absolutely nothing. Be specific about what you do and for whom.
- Setting and forgetting it. When you add major new pages or content categories, update your llms.txt to reflect that.
- Making it too long. A focused 300-word file consistently outperforms a rambling 3,000-word one. Precision beats comprehensiveness here.
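Most of these mistakes are mechanical enough to lint automatically. The sketch below checks a draft for three of them; the thresholds (such as the word count) are illustrative choices of mine, not part of any specification.

```python
# Sketch: lint an llms.txt draft for the common mistakes listed above.
# The 500-word ceiling is an assumed threshold, not a spec requirement.
import re

def lint_llms_txt(body: str) -> list[str]:
    """Return warnings for HTML content, a missing header, or excess length."""
    warnings = []
    if re.search(r"<[a-zA-Z][^>]*>", body):
        warnings.append("contains HTML tags; the file must be plain text")
    if not body.lstrip().startswith("# "):
        warnings.append("should open with a '# Brand Name' line")
    words = len(body.split())
    if words > 500:
        warnings.append(f"{words} words; keep it focused (roughly 300)")
    return warnings

good = (
    "# Brand\n"
    "> What the site does and for whom.\n"
    "## Key Pages\n"
    "- [Blog](https://yoursite.com/blog): Guides on the core topic\n"
)
print(lint_llms_txt(good))
print(lint_llms_txt("<html>an exported web page, not plain text</html>"))
```

Vagueness, of course, can't be linted; "we help businesses grow" passes every automated check and still tells an AI engine nothing.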
What Happens After You Add llms.txt?
AI engine crawlers revisit sites on a variable schedule — anywhere from days to months, depending on your site’s authority and how frequently you publish. Once GPTBot, ClaudeBot, and PerplexityBot recrawl your site and read the file, they have a cleaner picture of your content. The guesswork is gone.
You can verify your llms.txt is live and correctly formatted by running a free GEO audit at SearchScore.io — it checks your llms.txt as part of the AI Citability score and flags any issues immediately.
The Bigger Picture
llms.txt is one component of a broader GEO strategy. On its own, it won’t guarantee you appear in AI answers — nothing does that in isolation. Combined with proper bot access, solid schema markup, strong E-E-A-T signals, and deep topical content, it’s part of a complete AI visibility framework.
But of everything in that framework, it’s the easiest win. Most of your competitors haven’t got one. You can have one within the hour. In optimisation work, that combination — high impact, low effort, minimal competition — doesn’t come along often.
Build it today. Check your full GEO picture at SearchScore.io. Then work through the rest of the framework.
More in This Series
- ↑ Pillar: What Is GEO? The Complete Guide
- How to Appear in ChatGPT Answers
- Schema Markup for AI Search
- E-E-A-T for AI Search
- GEO vs SEO: What’s the Difference?
Frequently Asked Questions
What is llms.txt and what does it do?
llms.txt is a plain-text file placed at the root of your website that acts as a guide for AI language models. It describes what your site is about, lists your most important pages, and explains your key topics — helping AI systems understand your content without having to infer it from scattered web pages.
How is llms.txt different from robots.txt?
robots.txt controls which crawlers can access which pages — it is about permissions. llms.txt is about context and content — it tells AI models what your site is about and which content to prioritise. They serve different purposes and you need both: robots.txt to allow access, llms.txt to guide understanding.
How do you create an llms.txt file?
Create a plain text file with a site description, an About section, a list of important URLs with annotations, and a topics list. Save it as llms.txt and deploy it to yourdomain.com/llms.txt. The whole process typically takes under 30 minutes and requires no technical expertise beyond basic file editing.
About the Author
Ronnie Huss is a serial founder and AI strategist based in London. He builds technology products across SaaS, AI, and blockchain. Learn more about Ronnie Huss →
Written by
Ronnie Huss, Serial Founder & AI Strategist. Serial founder with 4 successful product launches across SaaS, AI tools, and blockchain. Based in London. Writing on AI agents, GEO, RWA tokenisation, and building AI-multiplied teams.