What is llms.txt?
The llms.txt file is a text file placed at the root of your website (yoursite.com/llms.txt) that provides AI crawlers with a structured summary of your site: who you are, what content is available, and which is most important.
Think of it as an introduction letter addressed to AIs. Without this file, AI crawlers must guess your site's structure and hierarchy. With it, you give them a clear map.
The specification was proposed by Jeremy Howard (co-founder of fast.ai) and rapidly adopted by the European SEO community. Agencies such as Re:signal (London) and Mindshare (Europe) have been recommending its implementation since late 2024.
In 2026, the main AI crawlers (GPTBot, OAI-SearchBot, PerplexityBot, Google-Extended) consult the llms.txt file when it is present. It is not an official standard in the W3C sense, but it has become a de facto standard.

Why your site needs it
Three fundamental reasons:
1. Guiding AI crawlers. Without llms.txt, AI crawlers treat all your pages equally. With it, you tell them which pages are most relevant and should be indexed first. It is a form of "SEO for AI".
2. Structured context. The llms.txt file includes a description of your company/site that helps AI understand your domain of expertise. It is a thematic authority signal.
3. Competitive advantage. As of March 2026, fewer than 5% of European sites have an llms.txt file. Early adopters benefit from a significant advantage in terms of AI visibility.
The results are measurable. An internal study by Ryte (Munich) showed that sites with a well-configured llms.txt saw an 18% increase in their AI citations in the 8 weeks following implementation, all else being equal.
Specification and format
The llms.txt file follows a simplified Markdown format. Here is the structure:
| Section | Required | Description |
|---|---|---|
| Title (H1) | Yes | Site/company name |
| Description | Yes | Paragraph describing the site, its domain, its expertise |
| ## Detailed info | No | Link to a long version (llms-full.txt) |
| ## Sections | Recommended | List of main site sections with URLs |
| ## Key pages | Recommended | Most important pages to index as a priority |
| ## Optional | No | Secondary pages, archives, etc. |
Concrete example for a B2B site:
```
# AI SOS - AI Visibility for Companies

> AI SOS is an AI visibility platform (AEO/GEO) that helps European B2B companies
> appear in responses from ChatGPT, Perplexity, Google AI Overview and Gemini.
> Based in Belgium, the team deploys Generative Engine Optimization strategies
> combining audit, technical optimisation, content creation and continuous monitoring.

## Detailed info
- [Full version](https://aisosystem.com/llms-full.txt): Detailed description of services and methods

## Key pages
- [Home](https://aisosystem.com/): Platform and offer presentation
- [Solution](https://aisosystem.com/#solution): Detail of the AI SOS method
- [Pricing](https://aisosystem.com/#tarifs): Plans and pricing
- [Blog](https://aisosystem.com/blog): Articles on AI visibility, AEO, GEO

## Blog - AI Visibility
- [Complete AI visibility guide](https://aisosystem.com/blog/visibilite-ia-guide-complet): Foundational guide
- [What is AEO](https://aisosystem.com/blog/qu-est-ce-que-aeo-answer-engine-optimization): AEO definition
- [GEO guide](https://aisosystem.com/blog/geo-generative-engine-optimization-guide): Generative Engine Optimization
```
Step-by-step implementation
Here is how to implement llms.txt in 30 minutes:
Step 1: Create the file. Create a text file named llms.txt in simplified Markdown format (see specification above).
Step 2: Write the description. Write a concise but complete description of your site. Include: name, domain of expertise, location, value proposition. This description will be read by LLMs to understand who you are.
Step 3: List priority pages. Identify your 10-20 most strategic pages and list them with a descriptive title and URL. These are the pages AI crawlers will prioritise.
Step 4: Deploy. Place the file at the root of your site: yoursite.com/llms.txt. Verify it is publicly accessible.
Step 5: Create llms-full.txt (optional). For sites with a lot of content, create a long version (llms-full.txt) with a detailed description of each section and page. The main llms.txt will point to this file.
Step 6: Reference in robots.txt. Add a comment line in your robots.txt to help crawlers find your llms.txt, for example: # LLMs info: /llms.txt (comment lines are ignored by robots.txt parsers, but remain visible to any bot that fetches the file).
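Before deploying, you can sanity-check your file against the structure from the specification table above. Here is a minimal Python sketch; the function name `validate_llms_txt` and the exact checks are our own illustration, not part of any formal spec:

```python
import re

def validate_llms_txt(text: str) -> list[str]:
    """Return a list of problems found in an llms.txt document (empty list = OK)."""
    problems = []
    lines = [line.rstrip() for line in text.strip().splitlines()]

    # Required: an H1 title ("# Site name") on the first line.
    if not lines or not lines[0].startswith("# "):
        problems.append("missing H1 title on the first line")

    # Required: a blockquote description ("> ...") after the title.
    if not any(line.startswith("> ") for line in lines):
        problems.append("missing '> ' description block")

    # Recommended: at least one "## " section.
    if not any(line.startswith("## ") for line in lines):
        problems.append("no '## ' sections found")

    # Pages should be listed as "- [Title](URL): description" items.
    link = re.compile(r"^- \[[^\]]+\]\(https?://[^)]+\)")
    if not any(link.match(line) for line in lines):
        problems.append("no Markdown link entries found")

    return problems

example = """# AI SOS - AI Visibility for Companies
> AI SOS is an AI visibility platform (AEO/GEO).

## Key pages
- [Home](https://aisosystem.com/): Platform and offer presentation
"""
print(validate_llms_txt(example))  # an empty list means the file passes
```

Running this over your draft before uploading it catches the most common mistakes: a missing title, a missing description block, or link lines that are not valid Markdown.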

Advanced techniques
Audience segmentation
If your site targets multiple audiences (e.g. B2B and B2C clients), structure your llms.txt with distinct sections. This helps AI direct the right content to the right query.
Automated updates
If you publish content regularly (blog), automate the update of llms.txt. Every new blog post should be added automatically. Most CMSs allow dynamic generation of llms.txt, just as they already do for sitemaps.
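As a sketch of that automation (the `Post` record and field names below are invented for illustration; adapt them to your CMS), a publish hook could regenerate the blog section of llms.txt each time a post goes live:

```python
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    url: str
    summary: str

def render_llms_txt(site_name: str, description: str, posts: list[Post]) -> str:
    """Assemble an llms.txt document with one '- [Title](URL): summary' line per post."""
    lines = [f"# {site_name}", f"> {description}", "", "## Blog"]
    lines += [f"- [{p.title}]({p.url}): {p.summary}" for p in posts]
    return "\n".join(lines) + "\n"

posts = [
    Post("What is AEO", "https://example.com/blog/aeo", "AEO definition"),
    Post("GEO guide", "https://example.com/blog/geo", "Generative Engine Optimization"),
]
print(render_llms_txt("Example Co", "B2B AI visibility platform.", posts))
```

Serve the rendered string at /llms.txt (or write it to disk at build time), and every new post is listed without manual edits, exactly as CMSs already regenerate sitemap.xml.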
Complementarity with Schema.org
llms.txt and Schema.org are complementary, not redundant. llms.txt provides a macro overview of the site. Schema.org provides the micro detail of each page. Both together maximise your AI readability. Combine them with a complete GEO strategy.
Multi-language
If your site is multilingual, create one llms.txt per language: llms.txt (primary), llms-fr.txt, llms-en.txt. Indicate the alternative versions in each file.
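For instance, the primary file could cross-reference its siblings in a dedicated section. The section name and domain below are illustrative; the llms-fr.txt naming follows the convention suggested above rather than a formal spec:

```
# Example Co

> English version of this file.

## Other languages
- [Version française](https://example.com/llms-fr.txt): French version of this file
```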
Need help with your llms.txt?
We configure and optimise your llms.txt file as part of our AI visibility audits.
FAQ
Is the llms.txt file an official standard?
Not in the W3C or IETF sense. It is a community specification proposed by Jeremy Howard and adopted de facto by the SEO community and AI crawlers. Its adoption is growing strongly and it is recognised by the main AI bots (GPTBot, PerplexityBot).
Does llms.txt replace robots.txt?
No. robots.txt tells bots what they should NOT crawl. llms.txt tells AI bots what they SHOULD crawl as a priority and how to understand your site. The two are complementary and should coexist.
Which AI crawlers read llms.txt?
In 2026, GPTBot and OAI-SearchBot (OpenAI), PerplexityBot, and Google-Extended consult the llms.txt file when it is present. Other emerging AI bots are also adopting it. The file is read as a standard Markdown document, so any crawler can interpret it.
What is the difference between llms.txt and an XML sitemap?
The XML sitemap lists all your site URLs with technical metadata (modification date, update frequency). The llms.txt provides semantic context: site description, content categories, and priority pages with descriptive titles. The sitemap is for search engines, the llms.txt is for AIs.
How many pages should be listed in llms.txt?
Ideally, 10 to 50 pages in the main file. For sites with more content, use llms-full.txt for the complete list. The main llms.txt should remain concise and only list the most strategic pages.
Does llms.txt have a measurable impact on AI citations?
Yes. The Ryte study (Munich, 2025) showed an average increase of 18% in AI citations in the 8 weeks following the implementation of a well-configured llms.txt. At AI SOS, we observe similar results. It is one of the AI visibility optimisations with the best effort-to-impact ratio.
Conclusion
The llms.txt file is one of the most effective quick wins for AI visibility. In 30 minutes of implementation, you give AI crawlers a clear map of your site and your expertise. And since fewer than 5% of sites have done it, you immediately get ahead.
It is the kind of optimisation that seems trivial but makes the difference. Like adding a sitemap.xml in 2005 or switching to HTTPS in 2015. Simple, but those who do it first benefit the most.
The rules have changed. Speak the language of AIs.