We ran the AI Visibility Checker on 50 well-known websites across five industries — ecommerce, SaaS, media, agencies, and local businesses. The goal was simple: find out how many are actually visible to AI assistants like ChatGPT, Claude, and Gemini.
The results were striking. Most websites are functionally invisible to AI.
The Numbers
Across all 50 sites, the average AI Visibility Score was 38 out of 100 — firmly in the "Weak" range. Only 6 out of 50 sites scored above 70 (Good). Nearly half scored below 35.
Scores by Industry
- SaaS companies: Average score 52/100 — the strongest category, but still in "Moderate" territory. Most SaaS sites had decent structured data and clean markup, but very few had an llms.txt file or explicitly allowed AI crawlers.
- Media & publishers: Average score 45/100 — surprisingly low for content-heavy sites. Many had strong semantic coverage but poor signal-to-noise ratios due to ad-heavy layouts.
- Digital agencies: Average score 38/100 — ironic, given that these are the companies selling SEO services. Most lacked structured data and had no AI-specific crawler policies.
- Ecommerce sites: Average score 34/100 — product pages often had schema.org markup for products, but landing pages and category pages were semantic wastelands.
- Local businesses: Average score 24/100 — the worst performers. Most had minimal content, no structured data, and boilerplate templates with terrible signal-to-noise ratios.
The 3 Most Common Failures
1. Blocking AI Crawlers (72% of sites)
Almost three-quarters of the sites we tested either had no robots.txt policy for AI crawlers or actively blocked them. The most commonly blocked bots were GPTBot (OpenAI) and CCBot (Common Crawl, used by many LLM training pipelines).
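As an illustration, a robots.txt that explicitly welcomes these crawlers might look like the sketch below (GPTBot, ClaudeBot, and CCBot are the published user-agent tokens for their respective operators; the blanket allow rules are an assumption about your policy, not a recommendation for every site):

```
# Explicitly allow common AI crawlers
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: CCBot
Allow: /

# Default policy for everything else
User-agent: *
Allow: /
```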
Many webmasters don't realize that blocking these crawlers doesn't just affect training data — it also prevents AI assistants from accessing your site during real-time retrieval (RAG) queries.
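If you want to check how a given robots.txt treats an AI crawler, Python's standard-library urllib.robotparser can evaluate the rules against any user-agent string. A minimal sketch, using an illustrative policy that blocks GPTBot but allows everyone else:

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt: blocks GPTBot, allows all other agents
rules = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Allow: /
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# GPTBot is blocked; a generic browser user-agent is not
print(rp.can_fetch("GPTBot", "https://example.com/pricing"))      # False
print(rp.can_fetch("Mozilla/5.0", "https://example.com/pricing")) # True
```

Point the same parser at your live file with `rp.set_url("https://yoursite.com/robots.txt")` followed by `rp.read()` to audit your real policy.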
2. No llms.txt File (94% of sites)
Only 3 out of 50 sites had an llms.txt file. This is the single easiest win for AI visibility: a plain-text file at your site root that tells LLMs what your site does, what it offers, and what matters. It takes 10 minutes to create, yet almost nobody has done it.
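There is no formal standard yet, but the emerging convention is a short markdown file served at /llms.txt. A hypothetical example for a fictional SaaS company (every name and URL below is a placeholder):

```
# Acme Analytics
> Acme Analytics is a self-serve product analytics platform for SaaS teams.

## Key pages
- [Pricing](https://example.com/pricing): Plans from free tier to enterprise
- [Docs](https://example.com/docs): API reference and integration guides

## About
Founded 2019. Core product: event-based analytics dashboards for product teams.
```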
3. Low Citation Likelihood (68% of sites)
Just over two-thirds of sites lacked clear, quotable statements that an LLM could confidently cite. Instead of crisp definitions and authoritative claims, most pages were filled with vague marketing copy: "We're passionate about delivering best-in-class solutions" tells an AI nothing it can cite.
What the Winners Did Right
The 6 sites that scored above 70 shared several common traits:
- Explicit AI crawler access in robots.txt — not just "allow all" but specific rules for GPTBot, ClaudeBot, and others.
- Rich JSON-LD structured data — not just basic Organization schema, but detailed product/service/FAQ markup.
- Content-first pages — high signal-to-noise ratio with minimal navigation clutter and no interstitial popups.
- Clear definitions and statements — opening paragraphs that answer "What is [X]?" directly, in a format LLMs can quote.
- Entity linking — author bios linked to LinkedIn profiles, company mentions linked to authoritative sources.
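To make the structured-data point concrete, here is a minimal sketch of the kind of JSON-LD an FAQ page might embed, using the schema.org FAQPage vocabulary (the company name and answer text are placeholders):

```
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "What is Acme Analytics?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "Acme Analytics is a self-serve product analytics platform for SaaS teams."
    }
  }]
}
</script>
```

Note how the Question/Answer pair doubles as exactly the kind of crisp, quotable definition the previous section said LLMs need.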
The Gap Is an Opportunity
Here's what makes this data actionable: the bar is still incredibly low. With most competitors scoring in the 30s, even basic AI visibility optimizations can put you ahead of your entire industry.
Adding an llms.txt file, updating your robots.txt, and restructuring your content with clear definitions can move your score from "invisible" to "competitive" in a single afternoon.
How Does Your Site Compare?
We've shown you the landscape. Now it's your turn. Run a free AI Visibility Check on your own site and see exactly where you stand compared to the 50 sites we analyzed.
The sites that optimize for AI visibility now will have a massive first-mover advantage. The question is: will you be one of them?