AEO: the 5 layers of AI search visibility
The five layers of AI search visibility, broken down: crawlability, structured data, llms.txt, entity signals, and WebMCP. Each is a separate piece of work and a separate signal AI systems use when deciding who to recommend.
Crawlability for AI search: how AI bots actually find your website
Crawlability for AI search is the foundation of AI visibility. If AI systems cannot read your website, nothing else matters. Here is how it works and what to fix.
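In practice, the first check is your robots.txt file: the major AI crawlers identify themselves with their own user-agent strings and respect robots.txt rules. A minimal sketch, assuming you want those crawlers to have full access (verify the current user-agent names against each vendor's documentation):

```text
# robots.txt at the site root
# Explicitly allow the main AI crawlers
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

# Everything else: default allow
User-agent: *
Allow: /
```

Also confirm that your CDN or firewall is not blocking these bots at the network level; an Allow rule does nothing if requests are rejected before robots.txt is ever read.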
Entity signals: how AI systems decide whether your business exists
Entity signals are the structured, consistent facts about your business that AI search tools use to confirm you are a real, trustworthy organisation worth recommending. Without them, you do not exist to those systems.
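One common way to send these signals is schema.org Organization markup with `sameAs` links tying your website to your profiles elsewhere on the web. A minimal sketch for a hypothetical business (all names and URLs are placeholders):

```json
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Acme Consulting Ltd",
  "url": "https://example.com",
  "logo": "https://example.com/logo.png",
  "sameAs": [
    "https://www.linkedin.com/company/acme-consulting",
    "https://g.page/acme-consulting"
  ]
}
```

The point of `sameAs` is consistency: the name, address, and profile links should match everywhere they appear, so AI systems can resolve them all to a single entity.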
llms.txt: the plain-text site index AI agents read first
llms.txt is a plain-text file that tells AI systems what your website contains and where to find the most important content. Without it, AI agents have to guess. Most businesses have not added one yet.
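The proposed llms.txt format is plain markdown: an H1 with the site name, a short blockquote summary, then sections of annotated links to your most important pages. A minimal sketch for a hypothetical business (all names and URLs are placeholders):

```markdown
# Acme Consulting

> Acme Consulting helps small businesses automate their back-office operations.

## Key pages

- [Services](https://example.com/services): What we offer and who it is for
- [Pricing](https://example.com/pricing): Plans and typical engagement costs

## Optional

- [Blog](https://example.com/blog): Guides on operations and automation
```

The file lives at the site root as `/llms.txt`, alongside robots.txt and sitemap.xml.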
Structured data for AI: telling AI systems what your business is
Structured data for AI search is the clearest signal you can give AI systems about what your business does, who it serves, and why it should be recommended. Most businesses have not done this yet.
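Structured data is usually added as a JSON-LD block in the page's `<head>`. A minimal sketch describing a service business (a hypothetical example; swap the `@type` for whichever schema.org type actually fits your business):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "ProfessionalService",
  "name": "Acme Consulting",
  "description": "Back-office automation for small businesses.",
  "url": "https://example.com",
  "areaServed": "GB",
  "priceRange": "££"
}
</script>
```

Validate the markup with a schema validator before shipping; malformed JSON-LD is silently ignored rather than flagged.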
WebMCP: how to make your website operable by AI agents
WebMCP lets AI agents interact with your website the way a human visitor would: reading content, completing actions, and retrieving information on behalf of users. Here is what it means for your business.