
AI visibility · 3 min read

The 6 signals AI checks before recommending your site

The six technical signals AI checks before referencing your site.

When you ask ChatGPT for a recommendation, it doesn't pick randomly. It evaluates which sites it can read, understand, and trust - then references the ones that make all three easy. Here are the six signals that matter most, scored by our AI Readiness Audit on a scale of 0 to 100.

1. llms.txt (up to 30 points)

A plain-text file at your site root that introduces your website to AI agents. It describes what you do, what pages exist, and what matters most.

This is the highest-impact signal. Without it, AI agents have no structured summary of your business. They have to guess - and they usually skip you.

Create a Markdown-formatted text file at yoursite.com/llms.txt with your business name, a one-line description, key services, and links to important pages. For deeper AI consumption, add llms-full.txt with full page content. Both files together earn the full 30 points.
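A minimal llms.txt might look like this - the business name, description, and URLs below are placeholders, not a required format:

```text
# Acme Web Studio

> Boutique web design agency in Austin, TX, specializing in
> fast, accessible sites for small businesses.

## Key pages

- [Services](https://example.com/services): What we build and pricing
- [Portfolio](https://example.com/work): Recent client projects
- [Contact](https://example.com/contact): How to reach us
```

The convention is an H1 with your name, a blockquote summary, then sections of links with short annotations so an agent can decide which pages to fetch.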

2. robots.txt (up to 25 points)

Controls which bots can access your site. Many websites unknowingly block AI crawlers like GPTBot, ClaudeBot, and PerplexityBot through overly broad rules or security plugins.

If these bots are blocked, AI can't read your site at all - regardless of how good your content is. Check yoursite.com/robots.txt and make sure there's no Disallow: / rule targeting AI crawlers.
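To be safe, you can allow the major AI crawlers explicitly. A sketch of what that looks like (check each vendor's documentation for their current user-agent strings):

```text
# Explicitly allow AI crawlers
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

# All other bots: default rules
User-agent: *
Allow: /

Sitemap: https://example.com/sitemap.xml
```

Per-bot rules take precedence over the `*` group, so this keeps AI crawlers open even if you later add broader restrictions.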

3. Sitemap (up to 15 points)

An XML file listing all your pages. Without one, AI crawlers can only find pages by following links from your homepage - they'll miss deeper content.

Most CMS platforms generate sitemaps automatically. Verify yours exists at yoursite.com/sitemap.xml and reference it in your robots.txt with a Sitemap: directive.
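If your platform doesn't generate one, a hand-written sitemap is just a short XML file - the URLs here are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2025-01-15</lastmod>
  </url>
  <url>
    <loc>https://example.com/services</loc>
    <lastmod>2025-01-10</lastmod>
  </url>
</urlset>
```

Each `<url>` entry needs only a `<loc>`; `<lastmod>` is optional but helps crawlers prioritize fresh pages.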

4. Structured data (up to 15 points)

JSON-LD markup in your page's <head> that gives AI typed information about your business - your name, type, location, products, contact details.

Without structured data, AI has to infer all of this from unstructured text. With it, AI can confidently classify your business, compare you to competitors, and generate accurate answers about what you offer.

Use the most specific Schema.org type that fits: LocalBusiness for physical locations, SoftwareApplication for tools, ProfessionalService for consultants.
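For a physical business, the markup is a single script tag in your page's `<head>`. A sketch with placeholder details:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Acme Plumbing",
  "description": "Same-day plumbing repairs for homes in Austin, TX.",
  "url": "https://example.com",
  "telephone": "+1-512-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Austin",
    "addressRegion": "TX",
    "postalCode": "78701"
  }
}
</script>
```

Swap `LocalBusiness` for the more specific type that fits you (e.g. `SoftwareApplication`, `ProfessionalService`) and add the properties that type supports.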

5. Meta description (up to 10 points)

The HTML tag that summarizes a page's content. AI agents use it to quickly understand what a page is about before deciding whether to reference it.

Write a unique description for every important page. Keep it under 160 characters and be specific about what the page offers - not generic marketing copy.
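In the page's `<head>`, that's one tag - the copy here is a made-up example of the specific, non-generic style to aim for:

```html
<meta name="description"
      content="Same-day plumbing repairs in Austin, TX. Licensed and insured, with upfront flat-rate pricing and a 1-year workmanship guarantee.">
```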

6. Open Graph tags (up to 5 points)

Meta tags that control how your site appears when shared or referenced. They provide a title, description, and image that AI can use when citing your site.

The least critical signal, but still contributes to a complete, professional presence. Most CMS platforms and SEO plugins handle this automatically.
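If you do need to add them by hand, the core Open Graph tags look like this (titles, URLs, and image paths are placeholders):

```html
<meta property="og:title" content="Acme Plumbing: Same-Day Repairs in Austin">
<meta property="og:description" content="Licensed plumbers with upfront pricing.">
<meta property="og:type" content="website">
<meta property="og:url" content="https://example.com/">
<meta property="og:image" content="https://example.com/og-image.jpg">
```

`og:title`, `og:type`, `og:url`, and `og:image` are the four properties the Open Graph protocol treats as required.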

How most websites score

Most sites score poorly. The most common gaps: no llms.txt (almost nobody has one yet), AI crawlers blocked by default, and no structured data.

The good news - these are all fixable in under an hour. Run a free AI Readiness Audit to see your score across all six signals.

Check your site's AI readiness

Run a free audit to see how visible your website is to AI agents. Takes about 5 seconds.