The 5-Layer Framework is the delivery model beknown.world uses on every engagement. It covers five specific technical signals: Crawlability, Structured Data, llms.txt, Entity Signals, and WebMCP. AI search tools use these signals, in combination, to decide which businesses to surface when a user asks for a recommendation. Most businesses have none of them in place. We build all five.
Why one layer is never enough
Think of a new restaurant. It has a sign above the door, a listing on a maps app, a review on a food site, a menu in the window, and a phone number in a directory. Remove any one of those and some customers cannot find it. Remove all but one and most customers cannot find it. AI systems work the same way. They pull from structured sources, crawl page content, read protocol files, consult knowledge registries, and now interact with live systems. A business that has only done one of those things is invisible to everything else.
The five layers are not a ranked list. They are not optional add-ons you build towards. They are a complete set. We treat them that way on every delivery.
The five layers, explained plainly
Layer 1: Crawlability
Before an AI system can tell anyone about your business, it needs to be able to read your website without obstruction. This sounds obvious. In practice, many business websites block AI crawlers by accident: a misconfigured robots.txt file, JavaScript-heavy pages that render no readable content for a crawler, or technical errors that return the wrong status codes.
Crawlability is the foundation. If an AI tool cannot read your site, nothing built on top of it matters.
We audit your crawl configuration, correct anything that prevents AI agents from accessing your content, and confirm that the pages which matter most are fully readable. This layer does not make you visible on its own. It makes everything else possible.
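As an illustration (not taken from any real site), the robots.txt below blocks OpenAI's GPTBot entirely while leaving other crawlers unaffected, which is exactly the kind of accidental lockout this layer catches. Python's standard-library robotparser can confirm what a given crawler is allowed to read. Crawler names such as GPTBot and ClaudeBot change over time, so check each vendor's current documentation for the ones that matter to you.

```python
from urllib.robotparser import RobotFileParser

# A robots.txt that accidentally shuts out an AI crawler:
# the second group blocks GPTBot from the entire site.
robots_txt = """\
User-agent: *
Disallow: /admin/

User-agent: GPTBot
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# GPTBot is blocked from everything, including public pages;
# a crawler not named in its own group falls back to the * rules.
print(parser.can_fetch("GPTBot", "https://example.com/services"))     # False
print(parser.can_fetch("ClaudeBot", "https://example.com/services"))  # True
```

Running a check like this against your live robots.txt is a quick way to see whether an old "block the bots" rule is still silently excluding the AI crawlers you now want to let in.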
Full detail on this layer lives at /aeo/crawlability.
Layer 2: Structured Data
Your website has words on it. Those words describe your business. But AI systems do not read prose the way a person does. They look for structured signals: formatted data that explicitly states what your business is, what it offers, where it operates, and how it relates to other entities.
Structured data is the translation layer. It takes what you already say and reformats it into a language AI systems are built to process.
We implement schema markup across your key pages. This covers your business type, your services or products, your contact details, your reviews, your opening hours, and any other attributes relevant to how AI tools categorise and recommend you. When a user asks an AI tool for a recommendation, structured data is often the difference between being mentioned and being invisible.
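For illustration, here is a minimal schema.org JSON-LD block of the kind this layer produces. The business details are placeholders, and a real implementation would cover more attributes; the markup is embedded in the page inside a `<script type="application/ld+json">` tag.

```json
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Plumbing Co",
  "description": "Emergency and scheduled plumbing for the Leeds area.",
  "url": "https://www.example.com",
  "telephone": "+44 113 000 0000",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 Example Street",
    "addressLocality": "Leeds",
    "addressCountry": "GB"
  },
  "openingHours": "Mo-Fr 08:00-18:00"
}
```

Everything in this block already appears somewhere on a typical business website as prose; the structured version simply states it in a form a machine can extract without guessing.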
Full detail on this layer lives at /aeo/structured-data.
Layer 3: llms.txt
This is a plain-text file that sits at the root of your website and tells AI language models exactly what your business does, who it serves, what your key pages are, and how to interpret your content. It is the equivalent of handing someone a brief before they read your website, except the someone is a large language model deciding whether to recommend you.
Most businesses do not have one. The ones that do have a significant advantage in how accurately and completely AI tools describe them.
We write and implement your llms.txt file based on your actual business, not a generic template. The content is specific, structured, and designed to align with how AI systems extract and summarise information.
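Under the llms.txt proposal, the file is plain Markdown: a heading with the business name, a short blockquote summary, then sections of annotated links to the pages that matter. A shortened, hypothetical example:

```text
# Example Plumbing Co

> Emergency and scheduled plumbing serving Leeds and West Yorkshire.

## Key pages

- [Services](https://www.example.com/services): full list of plumbing services and pricing
- [Coverage](https://www.example.com/areas): postcodes served and typical response times
- [Contact](https://www.example.com/contact): booking and emergency call-out details
```

The file lives at the site root (https://www.example.com/llms.txt in this sketch), alongside robots.txt.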
Full detail on this layer lives at /aeo/llms-txt.
Layer 4: Entity Signals
AI systems do not just read your website. They cross-reference what your website says against what the wider web says about you. If your business name, description, category, and contact details appear consistently across authoritative third-party sources, AI tools treat you as a confirmed, trustworthy entity. If those details are inconsistent, missing, or contradicted, AI tools become uncertain about you and that uncertainty usually means they mention someone else instead.
Entity signals are the reputation layer. They tell AI systems that you are real, that you are what you say you are, and that multiple independent sources agree.
We submit your business to the registries, directories, and structured data sources that AI systems treat as authoritative. We also audit your existing presence for inconsistencies and correct them. This layer takes longer to compound than the others, because it depends partly on third-party sources updating their records. But it is the layer that produces the most durable visibility over time.
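The consistency check at the heart of this layer can be sketched in a few lines. The listing records below are hypothetical, and a real audit also normalises abbreviations, Unicode, and address formatting before comparing, but the principle is the same: treat the website as canonical and flag every source that disagrees with it.

```python
# Hypothetical listing records pulled from third-party sources.
listings = {
    "website":   {"name": "Example Plumbing Co", "phone": "+44 113 000 0000"},
    "maps_app":  {"name": "Example Plumbing Co", "phone": "+441130000000"},
    "directory": {"name": "Example Plumbing Ltd", "phone": "+441139999999"},
}

def normalise(record):
    # Lowercase the name and strip phone formatting so trivial
    # differences do not count as conflicts.
    return (
        record["name"].lower().strip(),
        "".join(ch for ch in record["phone"] if ch.isdigit()),
    )

def find_inconsistencies(listings):
    # Report every source whose details disagree with the website,
    # which we treat as the canonical record.
    canonical = normalise(listings["website"])
    return sorted(
        source for source, record in listings.items()
        if normalise(record) != canonical
    )

print(find_inconsistencies(listings))  # ['directory']
```

Here the maps listing passes despite different phone formatting, while the directory entry is flagged for both a conflicting name and a conflicting number, which is the kind of contradiction that makes AI tools uncertain about an entity.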
Full detail on this layer lives at /aeo/entity-signals.
Layer 5: WebMCP
WebMCP is a protocol that allows AI agents to interact directly with your business systems. Not just read your website. Actually take actions: book appointments, query availability, retrieve product information, submit enquiries.
This matters now because AI tools are moving from answering questions to taking actions on behalf of users. A user who asks an AI assistant to find and book a service provider will be connected to a business that has implemented WebMCP. A business that has not will simply not be in the selection.
This is the newest of the five layers and the one most businesses have not encountered yet. We implement WebMCP endpoints that make your business accessible to AI agents in the way those agents are built to work.
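WebMCP builds on the Model Context Protocol, in which a business capability is exposed as a "tool": a name, a description, and a JSON Schema describing its inputs. The descriptor below is a hypothetical sketch of what an appointment-availability tool might declare; the tool name, fields, and registration mechanism are assumptions for illustration, as the WebMCP specification is still evolving.

```json
{
  "name": "check_availability",
  "description": "Return open appointment slots for a given service and date.",
  "inputSchema": {
    "type": "object",
    "properties": {
      "service": { "type": "string", "description": "Service identifier, e.g. 'boiler-repair'" },
      "date": { "type": "string", "format": "date" }
    },
    "required": ["service", "date"]
  }
}
```

A declaration like this is what lets an AI agent discover that your business can answer an availability query at all, rather than having to infer it from page text.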
Full detail on this layer lives at /aeo/webmcp.
How the layers work together
Each layer does something different. Together, they do something a single layer cannot: they make your business consistently findable, accurately described, and reliably recommended across every AI tool that a potential customer might use.
A business with structured data but no crawlability has formatted signals that never get read. A business with llms.txt but no entity signals has told AI tools what it is, but given them no external corroboration. A business with the first four layers but no WebMCP will be recommended but not acted upon as AI tools become more capable.
The 5-Layer Framework is comprehensive because partial implementation produces partial results. We have seen this clearly in our own delivery work. The businesses that see consistent AI referral traffic are the ones where all five layers are in place.
What the framework does not cover
The 5-Layer Framework is a technical delivery model. It builds the signals. It does not write your content strategy, run your social media, or manage your paid advertising.
It also does not guarantee placement in any specific AI tool's responses. No one can promise that. What we can build is a complete, correct, and well-structured signal set that gives AI tools no reason to overlook you and every reason to mention you when the question is relevant.
How we measure results
Every delivery is assessed using the 16-Probe Scan, which scores your site across all five layers using 16 specific checks. We run it before we start work to establish a baseline and after implementation to confirm what has changed.
Ongoing results are reported using the Four Numbers: your isitagentready level, your 16-Probe Scan score, your Citation Rubric score, and your AI referral traffic delta. These four figures tell you whether the framework is working and where further improvement is available.
Where to start
The free visibility check on beknown.world runs the 16-Probe Scan against your site and shows you which of the five layers are in place, which are missing, and what that means for how AI tools currently see your business. It takes two minutes and requires no technical knowledge to interpret.
Most businesses find at least three of the five layers are absent. That is not a criticism. It is simply where most businesses are right now. The ones who act on it first are the ones AI tools will recommend.
Frequently Asked Questions
What is the 5-Layer Framework?
The 5-Layer Framework is beknown.world's model for building AI visibility. It covers five specific signals that AI search tools use when deciding which businesses to recommend: Crawlability, Structured Data, llms.txt, Entity Signals, and WebMCP. Every delivery we do builds all five layers.
Why are there five layers and not just one?
Because AI systems gather information from multiple sources and in multiple formats. A business that only optimises one signal will still be invisible to tools that rely on the others. The five layers cover all the ways a modern AI search tool identifies, reads, and trusts a business.
Do I need all five layers?
Yes. The layers are interdependent. Structured data without crawlability means the data never gets read. Entity signals without structured data means AI tools cannot confirm what the entity actually does. Partial implementation produces partial results. We build all five because that is what produces consistent AI referrals.
How long does it take to implement the 5-Layer Framework?
Most implementations complete within two to four weeks from the point we have access to the site and a completed brief. Some layers, particularly entity signals and registry submissions, continue to compound over time as third-party sources pick up the signals we build.
How does the 5-Layer Framework relate to traditional SEO?
Traditional SEO optimises pages for keyword rankings in link-based search engines. The 5-Layer Framework optimises a business for recommendation by AI tools. The two are not in conflict, and some signals overlap, but the intent and audience are different. AI systems do not rank pages. They decide who to mention.
How do I know which layers my business is currently missing?
The 16-Probe Scan scores your site across all five layers using 16 specific checks. It shows exactly where you are visible, where you are absent, and what that absence is likely costing you in AI referrals. You can run it free at beknown.world.
What is WebMCP and why is it a layer?
WebMCP is a protocol that allows AI agents to interact directly with your business systems, not just read your website. As AI tools become more capable of taking actions on behalf of users, businesses that have implemented WebMCP become the ones those agents can actually work with. It is the newest layer and the one most businesses have not heard of yet.