Everyone Is Talking About llms.txt. Here's What Actually Matters.
A client asked us last month whether they needed an llms.txt file. They'd read three blog posts about it, watched a YouTube video, and were convinced it was the missing piece of their AI visibility strategy.
We told them the truth: probably not. At least, not for the reasons they thought.
The llms.txt file is a real proposal with a real specification. But the gap between what people think it does and what it actually does right now is wide enough to drive a truck through. So let's close that gap.
What Is llms.txt?
The llms.txt file is a proposed standard created by Jeremy Howard (co-founder of Answer.AI and fast.ai) in September 2024. The idea is simple: give AI models a structured, markdown-formatted summary of your website so they can understand your content quickly without crawling every page.
Think of it like a robots.txt for AI comprehension rather than access control. Where robots.txt tells crawlers which pages they may or may not access, llms.txt says "here's what this site is about, and here's where to find the important stuff."
The file lives at your site root: https://yoursite.com/llms.txt
The Format
The spec uses markdown with a specific structure. Here's what a real llms.txt file looks like:
# Your Company Name
> Brief description of what your company does.
> Include the key details an AI model would need
> to understand your business in 2-3 sentences.
## Docs
- [Getting Started](https://yoursite.com/docs/start): Quickstart guide for new users
- [API Reference](https://yoursite.com/docs/api): Full API documentation
- [Pricing](https://yoursite.com/pricing): Plans and pricing details
## Blog
- [How We Built X](https://yoursite.com/blog/built-x): Technical deep dive on architecture
- [Industry Report 2026](https://yoursite.com/blog/report): Original research on market trends
## Optional
- [Changelog](https://yoursite.com/changelog): Version history
- [Community](https://yoursite.com/community): User forum and discussions
The rules are straightforward:
- Start with an H1 containing your site or project name (the only required element)
- Add a blockquote with a short summary
- Use H2 headers for sections
- Each section contains a markdown list of links with optional descriptions after a colon
- An "Optional" section flags content that can be skipped when context length is limited
There's also llms-full.txt, a companion file that contains the full text of your site's key pages in a single document rather than just links and summaries. Some sites provide both.
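The structure above is simple enough that tooling can consume it in a few lines. Here's a minimal parser sketch; the function name and regex are ours, not part of the spec, and a production tool would want stricter handling:

```python
import re

# Matches list items of the form: - [Title](https://url): optional description
LINK_PATTERN = re.compile(r"-\s*\[([^\]]+)\]\(([^)]+)\)")

def parse_llms_txt(text: str) -> dict:
    """Parse llms.txt content into {section_name: [(title, url), ...]}.

    A hypothetical minimal reader: it walks the file line by line,
    treating each H2 header as a section and collecting the markdown
    links listed under it. Descriptions after the colon are ignored.
    """
    sections, current = {}, None
    for line in text.splitlines():
        if line.startswith("## "):
            current = line[3:].strip()
            sections[current] = []
        elif current is not None:
            match = LINK_PATTERN.match(line.strip())
            if match:
                sections[current].append((match.group(1), match.group(2)))
    return sections
```

Feeding it the example file above would yield a dictionary with "Docs", "Blog", and "Optional" keys, each mapping to its list of (title, url) pairs.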
What People Think It Does
The SEO and marketing world picked up llms.txt and ran with it. The pitch usually sounds like this: "Add this file, and ChatGPT and Perplexity will understand your site better, recommend you more often, and cite you in AI-generated answers."
That would be great if it were true.
What It Actually Does (Right Now)
Here's where we get honest.
SE Ranking analyzed nearly 300,000 domains and found no measurable relationship between having an llms.txt file and being cited in AI-generated answers. None. OtterlyAI tracked 62,100 AI bot visits over 90 days and found that only 0.1% of AI crawler requests touched /llms.txt. The file received fewer visits from AI bots than the average content page on the same site.
Google has explicitly said its AI Overviews rely on traditional search signals, not llms.txt. John Mueller confirmed in 2025 that no AI system uses it. OpenAI hasn't announced support either. Neither has Anthropic. Neither has Perplexity.
Let that sink in. The major AI platforms that people want to optimize for have not adopted this standard.
In our own AI visibility audits, we've seen no difference in AI recommendation rates between sites with llms.txt and sites without it. The factors that actually move the needle are content clarity, structured data, and not blocking AI crawlers in robots.txt.
Where It Does Have Value
That said, llms.txt isn't useless. It just isn't what the hype suggests.
If you run developer documentation, an API, or a knowledge-heavy product, llms.txt is genuinely useful for AI-powered coding assistants and agent workflows. Tools like Cursor, GitHub Copilot, and custom AI agents can use llms.txt to quickly understand a codebase or documentation structure. That's the use case Jeremy Howard originally designed it for.
It's also helpful as a simple exercise. Writing an llms.txt file forces you to articulate what your site is about, what the most important pages are, and how they relate to each other. That clarity benefits your content strategy regardless of whether any AI crawler reads the file.
Should You Create One?
Here's our honest recommendation, based on running AI visibility audits for dozens of companies.
If you have developer docs, an API, or technical documentation: yes. Create one. AI coding tools and agents are more likely to use it than search-focused AI platforms, and it takes 15 minutes.
If you're a regular business website hoping it will boost your ChatGPT recommendations: manage your expectations. There's no evidence it helps with that today. It doesn't hurt, either, and the file costs almost nothing to create and maintain. But spend your time on the things that actually matter first.
Those things are:
- Unblock AI crawlers. Check your robots.txt for GPTBot, ClaudeBot, and Google-Extended. About a third of the sites we audit block at least one without knowing it.
- Add structured data. Organization, Product, Service, and FAQ schema in JSON-LD. This is what AI models actually parse today.
- Write clearly. State what you do, who you serve, and what makes you different, in plain language, on your homepage and key landing pages.
Do those three things first. Then add llms.txt if you want. In that order.
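The first item on that list is easy to check programmatically. Here's a sketch using Python's standard-library robots.txt parser; the agent list and helper function are our own, not from any official tool, and you'd paste in your site's actual robots.txt content:

```python
from urllib.robotparser import RobotFileParser

# Common AI crawler user-agents worth checking (a representative
# sample, not an exhaustive list)
AI_AGENTS = ["GPTBot", "ClaudeBot", "Google-Extended", "PerplexityBot"]

def blocked_ai_agents(robots_txt: str, site_url: str = "https://example.com/") -> list:
    """Return the AI crawler user-agents that this robots.txt blocks
    from fetching the site root."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return [agent for agent in AI_AGENTS
            if not parser.can_fetch(agent, site_url)]

if __name__ == "__main__":
    sample = "User-agent: GPTBot\nDisallow: /\n\nUser-agent: *\nAllow: /\n"
    print(blocked_ai_agents(sample))  # → ['GPTBot']
```

If the function returns anything other than an empty list, you're blocking at least one AI crawler, knowingly or not.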
How to Create an llms.txt File
If you've decided to go ahead, here's how to do it right.
Step 1: Write Your Summary
Start with your company name as H1 and a blockquote that explains what you do. Be specific. "We make project management software for construction teams" beats "We empower organizations with innovative solutions."
Step 2: List Your Important Pages
Group your key pages into logical sections. Prioritize pages that contain information an AI would need to accurately describe your business: product pages, pricing, documentation, case studies.
Step 3: Deploy It
Save the file as llms.txt at your domain root so it's accessible at https://yoursite.com/llms.txt. If you have extensive content, consider also creating llms-full.txt with the complete text of your key pages in a single markdown file.
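Before uploading, a quick local sanity check can catch a missing H1 or an empty file. Here's a hypothetical helper that mirrors the format rules listed earlier; the function and its messages are our own sketch, not an official validator:

```python
def validate_llms_txt(text: str) -> list:
    """Sanity-check llms.txt content before deploying.

    Returns a list of problems; an empty list means the required
    structure is present. Checks follow the spec as described above:
    the H1 is the only required element, the blockquote summary and
    H2 link sections are recommended.
    """
    problems = []
    lines = [line for line in text.splitlines() if line.strip()]
    if not lines or not lines[0].startswith("# "):
        problems.append("file must start with an H1 ('# Site Name')")
    if not any(line.startswith("> ") for line in lines):
        problems.append("no blockquote summary found (recommended)")
    if not any(line.startswith("## ") for line in lines):
        problems.append("no H2 link sections found (recommended)")
    return problems
```

Run it on your draft, fix anything it flags, then upload the file and confirm it loads at https://yoursite.com/llms.txt in a browser.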
Step 4: Don't Obsess Over It
Update it when you add major new pages or change your product. But don't check your analytics daily hoping for AI bot visits. That 0.1% request rate from OtterlyAI's study should calibrate your expectations.
What To Do About It
The llms.txt standard might matter in the future. The spec is well-designed. If major AI platforms decide to adopt it, having the file ready puts you ahead. But right now, treating it as a critical AI visibility lever is putting the cart before the horse.
Focus on the fundamentals: clear content, proper structured data, open access for AI crawlers, and regular audits to see where you actually stand. Those are the things that determine whether ChatGPT, Gemini, and Perplexity recommend you today.
If you're not sure where your site stands with AI platforms, find out before you start optimizing. You might be invisible for reasons that have nothing to do with llms.txt.
Run the free AI visibility scan to check your site in 60 seconds.