

What is llms.txt and why your website needs one

A new file standard that helps AI agents understand your website.

You know robots.txt and sitemap.xml. There's a new file designed not for search engines, but for AI agents like ChatGPT, Perplexity, and Claude.

It's called llms.txt, and it's the single highest-impact file you can add to your site for AI visibility.

What it is

llms.txt is a plain-text file at your site root (e.g. https://example.com/llms.txt) that gives AI a machine-readable summary of your site. Think of it as a README for AI agents.

While robots.txt tells crawlers what they can access, llms.txt tells them what your site is - what it does, what pages exist, and what matters most.

What goes inside

A good llms.txt includes:

  • A one-line description of your site
  • What the site does - key features, services, or content
  • A list of important pages with links
  • Company or author info

Example:

# My SaaS Product

> A project management tool for remote teams.

## Features

- Real-time collaboration on tasks and documents
- Time tracking and reporting
- Integrations with Slack, GitLab, and Jira

## Pages

- Home: /
- Pricing: /pricing
- Docs: /docs
- Blog: /blog

## Company

Built by Acme Inc. - https://acme.com

How to add it

  1. Create a file called llms.txt in your site's public folder (or web root)
  2. Write a clear, structured summary using the format above
  3. Deploy - the file should be accessible at https://yoursite.com/llms.txt

No build step, no config, no dependencies.
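The three steps above can be sketched in a few lines of Python. The folder name `public` and the page content are assumptions - swap in your own site's details and whatever directory your host serves static files from:

```python
from pathlib import Path

# Hypothetical content -- replace with your own site's summary.
content = """\
# My SaaS Product

> A project management tool for remote teams.

## Pages

- Home: /
- Pricing: /pricing
- Docs: /docs
"""

# "public" is a common static web-root name (Next.js, Hugo, Vite);
# adjust to wherever your deploy pipeline picks up static files.
public = Path("public")
public.mkdir(exist_ok=True)
(public / "llms.txt").write_text(content, encoding="utf-8")
```

After deploying, the file should resolve at https://yoursite.com/llms.txt with no further configuration.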

llms-full.txt

For deeper AI consumption, also create llms-full.txt - a comprehensive version with the full content of every key page. This lets AI deeply understand your business without visiting each page individually.
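Generating llms-full.txt can be as simple as concatenating the source of your key pages. A minimal sketch, assuming your page content lives as markdown files under a hypothetical content/ folder (the paths and page list here are illustrative, not a standard):

```python
from pathlib import Path

# Hypothetical mapping of section titles to page sources --
# point these at wherever your page content actually lives.
pages = {
    "Home": "content/index.md",
    "Pricing": "content/pricing.md",
    "Docs": "content/docs.md",
}

sections = ["# My SaaS Product", "> A project management tool for remote teams."]
for title, path in pages.items():
    p = Path(path)
    if p.exists():  # skip missing pages rather than fail the build
        sections.append(f"## {title}\n\n{p.read_text(encoding='utf-8').strip()}")

Path("public").mkdir(exist_ok=True)
Path("public/llms-full.txt").write_text("\n\n".join(sections) + "\n", encoding="utf-8")
```

Regenerate the file whenever page content changes so the two stay in sync.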

In our AI Readiness Audit, having both files earns the full 30 points. llms.txt alone earns 25.

Verify it's working

Open https://yoursite.com/llms.txt in a browser to confirm the file loads and is served as plain text. Then run a free AI Readiness Audit to see how your site scores across all six AI visibility signals.
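You can also verify programmatically by fetching the URL and checking for a 200 response. This sketch defines a small check and demonstrates it against a throwaway local server standing in for your deployed site:

```python
import functools
import http.server
import os
import tempfile
import threading
import urllib.request

def llms_txt_ok(base_url: str) -> bool:
    """Return True if base_url/llms.txt responds with 200 and a non-empty body."""
    try:
        with urllib.request.urlopen(f"{base_url}/llms.txt", timeout=10) as resp:
            return resp.status == 200 and len(resp.read()) > 0
    except OSError:
        return False

# Demo: serve a temp directory containing an llms.txt on a random local port.
root = tempfile.mkdtemp()
with open(os.path.join(root, "llms.txt"), "w", encoding="utf-8") as f:
    f.write("# My SaaS Product\n")

handler = functools.partial(http.server.SimpleHTTPRequestHandler, directory=root)
server = http.server.ThreadingHTTPServer(("127.0.0.1", 0), handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

ok = llms_txt_ok(f"http://127.0.0.1:{server.server_port}")
server.shutdown()
print(ok)  # True
```

In practice you would call `llms_txt_ok("https://yoursite.com")` after each deploy.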

What else to do

llms.txt is one piece of the puzzle. For maximum AI visibility:

  • Allow AI crawlers in your robots.txt (guide)
  • Add JSON-LD structured data (guide)
  • Write clear meta descriptions for each page
  • Maintain a valid sitemap.xml
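For the first item, a robots.txt that explicitly welcomes some well-known AI crawlers might look like the snippet below. The user-agent strings vary by vendor and change over time, so treat these as examples and check each vendor's documentation for the canonical name:

```
# Allow common AI crawlers (verify names against each vendor's docs)
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /
```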

The sites that add these signals first will capture the most AI-driven traffic.

Check your site's AI readiness

Run a free audit to see how visible your website is to AI agents. Takes about 5 seconds.