The 16-Probe Scan is a free diagnostic tool that checks a business website across 16 specific signals used by AI search systems. It scores each signal as present or absent, returns an overall visibility score out of 16, and shows exactly which gaps are costing you recommendations from tools like ChatGPT, Perplexity, and Claude.
Why a scan, not just advice
Think of it like a building survey before you buy a property. You could walk around and form a general impression. You could read articles about what makes a building structurally sound. Or you could have someone check the actual roof, wiring, damp course, and foundations in a systematic way and tell you precisely what needs attention.
The 16-Probe Scan does the latter. It does not produce a general impression. It checks 16 specific things, one by one, and tells you which ones are present and which ones are not.
Most businesses discover that the signals AI systems rely on most are entirely absent from their site. That is not a criticism. It is simply a result of those signals not existing until recently.
What the scan checks
The 16 probes map directly to the 5-Layer Framework, beknown.world's model of AI visibility. Each layer has a set of probes associated with it.
Layer 1: Crawlability
AI systems cannot recommend what they cannot read. Before structured data, entity signals, or anything else matters, the content on your site needs to be accessible to automated systems.
The crawlability probes check whether your site is technically open to AI crawlers, whether your robots.txt file is blocking the wrong things, and whether your content is rendered in a way that automated systems can parse. A surprising number of sites inadvertently tell AI crawlers to stay out, often as an unintended side effect of other settings.
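As a rough illustration, the kind of check the crawlability probes perform can be sketched with Python's standard-library robots.txt parser. The crawler names below are real AI user-agents, but this is a representative subset, not the scan's actual probe list:

```python
from urllib.robotparser import RobotFileParser

# A representative subset of AI crawler user-agents; the full list evolves.
AI_CRAWLERS = ["GPTBot", "ClaudeBot", "PerplexityBot", "Google-Extended"]

def check_ai_access(robots_txt: str, site_url: str = "https://example.com/") -> dict:
    """Return, per AI crawler, whether this robots.txt permits fetching the site root."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return {bot: parser.can_fetch(bot, site_url) for bot in AI_CRAWLERS}

# A blanket "Disallow: /" shuts out AI crawlers along with everything else:
blocked = check_ai_access("User-agent: *\nDisallow: /")
```

A site whose robots.txt contains `User-agent: *` followed by `Disallow: /` fails this check for every AI crawler, which is exactly the "unintended side effect" described above.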
The relevant pillar page for this layer is /aeo/crawlability.
Layer 2: Structured Data
Structured data is a set of labels you attach to your content. Think of it as the difference between handing someone a business card and handing them a sheet of paper with your name written somewhere in the middle of a paragraph. The business card communicates what each piece of information is. The paragraph does not.
The structured data probes check whether your site includes schema markup, which types are present, whether they are valid, and whether they describe the things AI systems most commonly need to know about a business: what it does, where it operates, how to contact it, and what people say about it.
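For illustration, a minimal JSON-LD block of the kind these probes look for might look like the following, embedded in a page inside a `<script type="application/ld+json">` tag. The business details are placeholders:

```json
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Acme Plumbing",
  "description": "Emergency plumbing for homes and small businesses.",
  "url": "https://example.com",
  "telephone": "+44-117-496-0000",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 High Street",
    "addressLocality": "Bristol",
    "addressCountry": "GB"
  }
}
```

This answers the questions listed above in machine-readable form: what the business is, where it operates, and how to contact it.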
The relevant pillar page is /aeo/structured-data.
Layer 3: llms.txt
The llms.txt file is a relatively new signal. It is a plain-text file that sits at the root of your website and tells AI systems, in plain language, what your business does, who it serves, and what the key pages on your site are for.
Most businesses do not have one. The probes in this layer check whether the file exists, whether it is correctly formatted, and whether it contains the information AI systems use when deciding how to summarise a business.
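A minimal llms.txt, following the emerging convention of a Markdown-formatted file at the site root, might look like this. All business details are placeholders:

```markdown
# Acme Plumbing
> Emergency plumbing services for homes and small businesses in Bristol, UK.

## Key pages
- [Services](https://example.com/services): What we fix and typical response times
- [Contact](https://example.com/contact): Phone, email, and service area
```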
The relevant pillar page is /aeo/llms-txt.
Layer 4: Entity Signals
An entity, in the way AI systems think about it, is a thing that exists independently of any single web page. Your business is an entity. Your business name, address, description, and category should all exist consistently across multiple sources so that AI systems can be confident they are referring to the right thing.

The entity signal probes check whether your business information is consistent across the sources AI systems consult, whether you appear in business registries and knowledge graphs, and whether the signals used to identify your business point to the same place.
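The consistency check at the heart of this layer can be sketched in Python. The records, field names, and normalisation rules here are illustrative assumptions, not the scan's actual logic:

```python
import re

def normalise(value: str) -> str:
    """Lower-case and collapse punctuation/whitespace so trivial formatting
    differences (e.g. "Acme Ltd." vs "ACME LTD") do not count as conflicts."""
    return re.sub(r"[^a-z0-9]+", " ", value.lower()).strip()

def nap_consistent(records: list[dict]) -> bool:
    """Check that name, address, and phone agree across every source record.
    In practice the records would come from business registries, map
    listings, and knowledge graphs."""
    for field in ("name", "address", "phone"):
        if len({normalise(r[field]) for r in records}) > 1:
            return False
    return True
```

Two listings that disagree on the address, or that use a different trading name, fail this kind of check even if each listing is individually correct.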
The relevant pillar page is /aeo/entity-signals.
Layer 5: WebMCP
WebMCP is the protocol that allows AI agents to interact directly with a business's systems: booking, enquiry, purchasing, or information retrieval. Think of it as leaving the door open rather than just putting a sign in the window.
The WebMCP probes check whether your site exposes any MCP-compatible endpoints, and whether AI agents could complete a task on behalf of a user without redirecting them elsewhere.
The relevant pillar page is /aeo/webmcp.
How the scoring works
Each probe returns a binary result. Either the signal is present or it is not. There is no partial credit, because AI systems do not award partial credit either. A structured data schema that is present but malformed is treated the same as one that does not exist.
The overall score is the count of probes that returned a positive result, expressed out of 16.
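Because every probe is binary, the scoring reduces to a simple count. A sketch in Python, with hypothetical probe names standing in for the real ones:

```python
# Each probe yields True (signal present and valid) or False; no partial credit.
# These probe names are illustrative, not the tool's actual identifiers.
probe_results = {
    "robots_txt_allows_ai_crawlers": True,
    "schema_markup_valid": False,
    "llms_txt_present": False,
    # ...13 further probes in the real scan
}

def visibility_score(results: dict) -> str:
    """Count positive probes and express the total out of 16."""
    return f"{sum(results.values())}/16"
```

Note that a malformed schema counts as `False` here, matching the no-partial-credit rule above.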
Typical first-scan scores look like this:
- 0 to 4: No meaningful AI visibility. The business is effectively invisible to AI recommendation systems.
- 5 to 8: Some baseline signals present, usually crawlability and basic structured data, but the signals that most influence AI recommendations are missing.
- 9 to 12: A reasonable foundation. Several layers are addressed. Gaps remain, usually in entity signals and WebMCP.
- 13 to 16: Strong AI visibility. This is where most businesses want to be, and where very few currently are.
Most businesses that run the scan for the first time score between 3 and 7. That is not because their sites are poorly built. It is because the signals AI systems use are different from the signals traditional search engines use, and most websites were built with the latter in mind.
What the scan is not
The 16-Probe Scan is not an SEO audit. It does not measure page speed, keyword rankings, backlink profiles, or domain authority. Those things matter for traditional search. They are largely irrelevant to AI recommendation systems.
It is also not a content audit. It does not evaluate whether your copy is well-written, whether your value proposition is clear, or whether your pricing page converts. Those are real concerns. They are just not what the scan measures.
The scan measures one thing: whether the technical and structural signals that AI systems use to identify, understand, and recommend businesses are present on your site.
How to use the results
The scan report lists each probe, its result, and a plain-English explanation of what a failed probe means for your visibility.
Some probes can be addressed quickly. Adding an llms.txt file, for instance, is a one-hour task for anyone comfortable editing website files. Others, like building entity signals across multiple external sources or implementing WebMCP endpoints, take longer and require more systematic work.
The scan does not tell you in what order to address the gaps. That depends on your business, your current position, and where your competitors are. But it gives you a complete picture of where you stand, which is the only useful starting point.
If you want to understand your current position, run the free 16-Probe Scan on your site. It takes less than two minutes.
FAQ
What is the 16-Probe Scan?
The 16-Probe Scan is a free diagnostic tool from beknown.world that checks a business website across 16 specific signals used by AI search systems. It returns a score and a breakdown showing which signals are present, which are missing, and what that means for how AI tools like ChatGPT and Perplexity find and recommend your business.
How long does the 16-Probe Scan take?
The scan runs in under two minutes. You enter your website URL and receive a scored report across all 16 probes. No account is required to run the scan. The results are available immediately.
What does the 16-Probe Scan actually test?
The scan tests signals across all five layers of the 5-Layer Framework: crawlability, structured data, llms.txt, entity signals, and WebMCP. Each probe checks for a specific condition, either present or absent, and the combination of results produces your overall visibility score.
What score should I be aiming for?
Most businesses score between 3 and 7 out of 16 on their first scan. A score above 12 indicates strong AI visibility. The majority of businesses have not yet implemented the signals that matter to AI systems, so even a moderate score improvement delivers a meaningful competitive advantage.
Is the 16-Probe Scan the same as an SEO audit?
No. An SEO audit tests signals that influence traditional search rankings: page speed, backlinks, keyword density. The 16-Probe Scan tests entirely different signals, the ones AI systems use to decide whether to recommend a business. There is some overlap in crawlability, but the two assessments are largely distinct.
What happens after I run the scan?
You receive a scored breakdown of all 16 probes. The report shows which signals are present, which are missing, and which layer of the 5-Layer Framework each probe belongs to. From there, you can address the gaps yourself or ask beknown.world to implement them for you.
Do I need to be technical to understand the results?
No. The scan report is written in plain English. Each probe result is explained in terms of what it means for your visibility, not in terms of code or configuration. The score itself is self-explanatory: the higher it is, the more likely AI systems are to find and recommend your business.