Most Brands Are Invisible to AI. And They Don't Know It.
We ran an AI visibility audit on a B2B software company last month. They rank on page one of Google for their primary keyword. Solid domain authority. Good backlinks. Their SEO team was happy.
Then we asked ChatGPT, Gemini, and Perplexity to recommend tools in their category. Not mentioned once. Not in the top five. Not as an alternative. Not even in a footnote.
That's what an AI visibility audit is for. It tells you whether AI platforms know you exist, and if they don't, it shows you why.
What Is an AI Visibility Audit?
An AI visibility audit checks how your brand appears (or doesn't) when people ask AI assistants for recommendations, comparisons, and answers in your space.
It's not the same as an SEO audit. SEO audits measure how Google's search index sees your site: crawlability, keywords, backlinks. An AI visibility audit measures something different. Do large language models like GPT-4, Gemini, and Claude actually know about your brand? Do they recommend it in context?
The distinction matters because AI platforms don't work like search engines. They don't crawl a list of results and rank them. They pull from training data, retrieval-augmented sources, and web access, then assemble an answer. Your brand either makes it into that answer or it doesn't. There's no page two. You're mentioned or you're invisible.
An audit typically covers three things:
- Brand mentions. Does the AI name your company when asked about your category?
- Accuracy. When it does mention you, is the information correct and current?
- Positioning. Are you recommended first, listed among others, or only mentioned when someone asks about you directly?
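If you want to track these three dimensions systematically, a simple per-prompt, per-platform record works well. Here's a minimal Python sketch; the field names and position labels are our own convention, not any standard:

```python
from dataclasses import dataclass, field

@dataclass
class AuditResult:
    """One row of the audit: how a single platform answered a single prompt."""
    prompt: str
    platform: str                # e.g. "ChatGPT", "Gemini", "Perplexity"
    mentioned: bool              # did the answer name your brand at all?
    position: str                # "first", "listed", "direct-only", or "absent"
    accurate: bool               # was the information about you correct?
    competitors: list[str] = field(default_factory=list)

# Example row: strong Google rankings, invisible to the AI.
r = AuditResult(
    prompt="What are the best CRM tools for small B2B teams?",
    platform="ChatGPT",
    mentioned=False,
    position="absent",
    accurate=True,
    competitors=["CompetitorA", "CompetitorB"],
)
```

One record per prompt per platform is enough structure to spot the patterns later.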
Why This Matters Now
The chatbot market hit $7 billion in 2024 and is projected to reach $20 billion by 2029. More people are skipping Google entirely and asking ChatGPT or Perplexity directly. When someone asks "what's the best project management tool for remote teams?" and the AI lists five options, those five companies get the click. Everyone else gets nothing.
This isn't theoretical. We see it in our audits constantly: companies with strong traditional SEO that are completely absent from AI recommendations. And the reverse happens too. Smaller brands with clear, well-structured content show up consistently in AI answers despite modest search rankings.
The shift is happening fast. If you wait until AI search is the dominant channel to start optimizing for it, you'll be playing catch-up against competitors who started earlier.
How to Run an AI Visibility Audit
You don't need expensive tools to start. Here's the process we use, broken into four steps.
Step 1: Build Your Prompt List
Write 15-20 prompts that a potential customer might type into an AI assistant. Mix these categories:
- Direct category queries. "What are the best [your category] tools?"
- Comparison queries. "Compare [your brand] vs [competitor]"
- Problem queries. "How do I solve [problem your product addresses]?"
- Recommendation queries. "What [your category] do you recommend for [specific use case]?"
Be specific. "Best CRM" is too broad. "Best CRM for B2B companies with under 50 employees" is what real people actually ask.
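If it helps to keep the list systematic, you can expand the four categories from templates. A small Python sketch; the category, brand, competitor, and use-case strings are placeholders:

```python
# Placeholders - swap in your own category, brand, and competitors.
CATEGORY = "CRM for B2B companies with under 50 employees"
BRAND = "YourBrand"
COMPETITOR = "CompetitorX"
USE_CASES = ["remote sales teams", "startups on a budget"]

# One template per category from the list above.
templates = [
    f"What are the best {CATEGORY} tools?",                  # direct category
    f"Compare {BRAND} vs {COMPETITOR}",                      # comparison
    "How do I keep track of sales leads without losing follow-ups?",  # problem
]
templates += [f"What {CATEGORY} do you recommend for {u}?" for u in USE_CASES]

for p in templates:
    print(p)
```

Multiply out a few competitors and use cases and you'll hit 15-20 prompts quickly.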
Step 2: Test Across Multiple AI Platforms
Run every prompt through at least three platforms:
- ChatGPT (GPT-4), the largest user base
- Google Gemini, integrated into Google Search via AI Overviews
- Perplexity, growing fast, cites sources explicitly
For each response, record:
- Were you mentioned? (yes/no)
- What position? (first recommendation, listed among others, only when asked directly)
- Was the information accurate?
- Which competitors were mentioned instead?
Do this in a spreadsheet. It sounds tedious, but the patterns that emerge are revealing. You'll quickly see which platforms know about you and which don't. And which competitors consistently appear where you don't.
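The spreadsheet can be as simple as a CSV. A minimal Python sketch; the column names and example rows are our own, illustrative only, with responses recorded by hand:

```python
import csv

# Our own column convention, one row per prompt/platform pair.
FIELDS = ["prompt", "platform", "mentioned", "position", "accurate", "competitors"]

rows = [
    {"prompt": "Best CRM for small B2B teams?", "platform": "ChatGPT",
     "mentioned": "yes", "position": "listed", "accurate": "yes",
     "competitors": "CompetitorA;CompetitorB"},
    {"prompt": "Best CRM for small B2B teams?", "platform": "Gemini",
     "mentioned": "no", "position": "absent", "accurate": "",
     "competitors": "CompetitorA"},
]

with open("audit.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(rows)
```

Anything that opens in a spreadsheet and supports filtering by platform will do.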
Step 3: Check Your Technical Foundations
AI platforms pull information from your website, but only if they can access it and parse it correctly. Check these:
Robots.txt. Are you blocking AI crawlers? Many sites accidentally block GPTBot, Google-Extended, or ClaudeBot. Check your robots.txt file:
# Good - allows AI crawlers
User-agent: GPTBot
Allow: /

User-agent: Google-Extended
Allow: /

User-agent: ClaudeBot
Allow: /

# Bad - blocks them
User-agent: GPTBot
Disallow: /
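You can script this check with Python's standard-library robots.txt parser. A sketch, covering the three crawlers named above; pass in the text of your own robots.txt:

```python
from urllib import robotparser

AI_CRAWLERS = ["GPTBot", "Google-Extended", "ClaudeBot"]

def blocked_ai_crawlers(robots_txt: str, site: str = "https://example.com") -> list[str]:
    """Return the AI crawlers that this robots.txt blocks from the site root."""
    rp = robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return [bot for bot in AI_CRAWLERS if not rp.can_fetch(bot, site + "/")]

# The "bad" example from above: GPTBot is shut out.
bad = "User-agent: GPTBot\nDisallow: /\n"
print(blocked_ai_crawlers(bad))  # ['GPTBot']
```

In practice you'd fetch `https://yoursite.com/robots.txt` first and feed the body to this function.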
Structured data. JSON-LD helps AI platforms understand what your business does, what you offer, and how you relate to your category. At minimum, add Organization and Product/Service schema:
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Your Company",
  "description": "One sentence about what you do",
  "url": "https://yoursite.com",
  "sameAs": [
    "https://linkedin.com/company/yourcompany",
    "https://twitter.com/yourcompany"
  ]
}
</script>
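One quick way to verify what's actually on a live page is to pull the JSON-LD blocks out of the HTML and list their @type values. A stdlib-only sketch; the inline snippet stands in for a fetched page, and the regex is a rough match, not a full HTML parser:

```python
import json
import re

# Stand-in for a fetched page; in practice, download your homepage HTML.
html = '''
<script type="application/ld+json">
{"@context": "https://schema.org", "@type": "Organization", "name": "Your Company"}
</script>
'''

# Rough extraction of JSON-LD script bodies.
pattern = re.compile(
    r'<script[^>]*type="application/ld\+json"[^>]*>(.*?)</script>',
    re.DOTALL | re.IGNORECASE,
)

types = []
for block in pattern.findall(html):
    data = json.loads(block)
    items = data if isinstance(data, list) else [data]
    types += [item.get("@type") for item in items]

print(types)  # ['Organization']
```

If `types` is missing Organization or Product/Service on your key pages, that's your first fix.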
Content clarity. AI models favor content that states things clearly and directly. If your homepage says "We empower organizations to unlock synergies" instead of "We make project management software for remote teams," the AI will struggle to categorize and recommend you.
Step 4: Analyze the Gaps
After collecting your data, look for these patterns:
- Platform gaps. Maybe ChatGPT knows you but Gemini doesn't. That tells you where to focus.
- Category gaps. You show up for "best [tool]" but not for "how to solve [problem]." That's a content gap you can fill.
- Accuracy issues. The AI mentions you but gets your pricing or features wrong. That's a structured data problem.
- Competitor dominance. One competitor appears in every response. Study what they're doing differently. Often it's clearer content structure or better schema markup.
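A few lines of Python can surface the platform-gap and competitor-dominance patterns from your recorded results. A sketch with made-up example rows:

```python
from collections import Counter, defaultdict

# Each row: (prompt, platform, mentioned?, competitors named instead).
rows = [
    ("best tools", "ChatGPT", True, ["CompetitorA"]),
    ("best tools", "Gemini", False, ["CompetitorA", "CompetitorB"]),
    ("solve problem X", "ChatGPT", False, ["CompetitorA"]),
    ("solve problem X", "Perplexity", False, ["CompetitorC"]),
]

mentions = defaultdict(lambda: [0, 0])   # platform -> [mentioned, total]
competitor_counts = Counter()
for _, platform, mentioned, competitors in rows:
    mentions[platform][1] += 1
    mentions[platform][0] += mentioned    # True counts as 1
    competitor_counts.update(competitors)

for platform, (hit, total) in mentions.items():
    print(f"{platform}: mentioned in {hit}/{total} answers")
print("Most dominant competitor:", competitor_counts.most_common(1)[0][0])
```

With real data, the platform with the lowest hit rate and the competitor at the top of the counter tell you where to dig.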
What We Actually Find in Audits
After running dozens of AI visibility audits, the same issues come up again and again.
The "good SEO, invisible to AI" problem. About 60% of the companies we audit have decent search rankings but poor AI visibility. The most common reason? Their content is optimized for keywords but doesn't clearly state what they do or who they serve. SEO content often dances around the topic with variations and long-tail phrases. AI models want direct, clear statements.
Blocked crawlers. Roughly one in three sites we audit blocks at least one major AI crawler in robots.txt, usually without knowing it. Some CMS platforms and security plugins add these blocks by default.
Missing structured data. Most sites have basic schema (maybe an Organization type), but few have Product, Service, or FAQ schema that helps AI models understand what they actually offer. The sites that show up consistently in AI recommendations almost always have thorough structured data.
The competitor you didn't expect. In nearly every audit, there's a competitor the client didn't consider. Often a smaller company with excellent content structure that dominates AI recommendations. Traditional market share doesn't predict AI visibility.
Inconsistent across platforms. A brand might appear in ChatGPT's top three recommendations but be completely absent from Gemini and Perplexity. Each model has different training data, different retrieval methods, and different recency cutoffs. Optimizing for one doesn't guarantee visibility in another. That's why testing across all three matters.
FAQ pages get cited a lot. AI models love well-structured FAQ content. Companies with detailed FAQ pages, especially ones using FAQ schema markup, get cited far more often than those without. If you have an FAQ section buried three clicks deep on your site, move it up. If you don't have one, build it.
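A minimal FAQPage block, in the same JSON-LD style as the Organization example above; the question and answer text are placeholders:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [
    {
      "@type": "Question",
      "name": "What does [Your Product] do?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "A direct, one-paragraph answer in plain language."
      }
    }
  ]
}
</script>
```

One entry per real question; the direct, plain-language answers matter as much as the markup itself.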
What Doesn't Work (Yet)
Some things the industry promotes that we haven't seen move the needle:
llms.txt files. The idea is sound: a machine-readable summary of your site for AI crawlers. In practice, there's no evidence that any major AI platform currently reads or uses llms.txt files. It doesn't hurt to have one, but don't expect it to change your AI visibility overnight.
Keyword stuffing for AI. Some agencies recommend cramming "as recommended by AI" or "top-rated by ChatGPT" into your content. This is as ineffective as it sounds. AI models don't work that way.
One-time audits. AI visibility isn't static. Models get updated, training data changes, and competitors improve. A single audit gives you a snapshot. You need to recheck quarterly at minimum.
What To Do About It
If you've never checked your AI visibility, start here:
- Run a quick scan. Use a tool like AIReadyCheck to get a baseline in 60 seconds. It checks your structured data, robots.txt, content clarity, and more.
- Do the manual prompt test. Pick your five most important keywords, run them through ChatGPT, Gemini, and Perplexity, and see where you stand.
- Fix the technical basics. Unblock AI crawlers in robots.txt. Add Organization and Product/Service JSON-LD. Make sure your homepage clearly states what you do in plain language.
- Rewrite for clarity. Take your top landing pages and ask: "If an AI read this, would it know what we sell and who we sell it to?" If the answer is no, rewrite the first two paragraphs.
- Set a quarterly check. AI visibility changes. Put a recurring reminder to re-run your audit every three months.
The brands that show up in AI recommendations in 2026 will be the ones that started paying attention in 2025. The technical bar isn't high. It's mostly about clarity, structure, and not accidentally blocking the crawlers. But you have to actually check.
Run the free AI visibility scan to check your site in 60 seconds.