Google’s New AI Search Guide Calls AEO And GEO ‘Still SEO’
Shalin Siriwardhana's take
My take on "Google’s New AI Search Guide Calls AEO And GEO ‘Still SEO’" is that the real value is in turning the idea into an operating decision. I would look for the signal behind the tactic: what is weakening trust, what can be measured cleanly, and what action will compound over time.
Whenever a new technology shifts the landscape of the web, a new set of acronyms usually follows. Lately, the industry has been buzzing with terms like AEO (Answer Engine Optimization) and GEO (Generative Engine Optimization). For many of us, this creates a sense of urgency—a feeling that the rules of the game have fundamentally changed and that we need to learn an entirely new discipline to stay visible.
It is easy to get caught up in the hype cycle, especially when consultants start selling specialized "GEO services." However, Google recently released a documentation page that serves as a necessary reality check. The guide, titled "Optimizing your website for generative AI features on Google Search," essentially tells us to take a deep breath.
This isn't just a minor update; it's an expansion of their 2025 AI documentation. While the previous version focused on the "how" of AI Overviews and AI Mode—explaining how content is included and how performance is tracked—this new guide gets into the weeds of optimization. More importantly, it explicitly tells site owners what they can stop worrying about.
Google Says AEO And GEO Are 'Still SEO'
The most striking part of the guide is how Google handles the terminology debate. There is a lot of noise suggesting that generative AI requires a completely different framework than traditional search. Google is very direct in shutting this down.
They define AEO as "answer engine optimization" and GEO as "generative engine optimization," but their conclusion is simple: from the perspective of Google Search, optimizing for generative AI is simply optimizing for the search experience. In other words, it is still SEO.
This isn't a surprising stance if you follow the technical side of how these systems work. Google explains that its AI features are rooted in the same core ranking and quality systems that have always powered Search. They rely on a process called retrieval-augmented generation (RAG) and "query fan-out" to pull content from the existing Search index. Because the AI is drawing from the same index that traditional search uses, the foundational pillars of SEO remain the primary levers for visibility.
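The mechanics above can be sketched in a few lines. This is a deliberately naive illustration of the RAG-with-fan-out idea, not Google's actual implementation: all function names, the expansion rules, and the toy index are assumptions made for the example.

```python
# Conceptual sketch: one query fans out into sub-queries, each sub-query is
# run against the SAME search index, and the retrieved pages ground the
# generated answer. Illustrative only -- not Google's implementation.

def fan_out(query: str) -> list[str]:
    """Expand one query into related sub-queries (stubbed for illustration)."""
    return [query, f"{query} examples", f"{query} best practices"]

def retrieve(sub_query: str, index: dict[str, str]) -> list[str]:
    """Naive retrieval: return pages sharing at least one word with the sub-query."""
    words = set(sub_query.lower().split())
    return [url for url, text in index.items()
            if words & set(text.lower().split())]

def grounding_sources(query: str, index: dict[str, str]) -> list[str]:
    """Collect the documents the generative step would draw on."""
    sources: list[str] = []
    for sq in fan_out(query):
        for url in retrieve(sq, index):
            if url not in sources:
                sources.append(url)
    return sources

# Toy "search index": the same one classic blue-link search would use.
index = {
    "example.com/semantic-html": "semantic html structure best practices",
    "example.com/cooking": "pasta recipes",
}
print(grounding_sources("semantic html", index))  # → ['example.com/semantic-html']
```

The point of the sketch is the shared `index`: because the AI features and classic search pull from the same place, ranking well in one means being retrievable by the other.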
This official documentation mirrors what Google representatives, such as Gary Illyes and Cherry Prommawin, have been saying at conferences and on Search Central Live. For a long time, these were just verbal assurances. Now, they are part of the official record, providing a clear reference point for anyone feeling pressured to pivot their entire strategy toward a "generative" framework.
What Google Says You Don't Need To Do
One of the most useful parts of the new guide is the "Mythbusting generative AI search" section. It is refreshing to see a platform explicitly list tactics that site owners should ignore. In an industry where we are often told to do more, being told to do less is a significant win.
First, let's talk about the technical "hacks." There has been a trend toward creating llms.txt files or using special AI-specific markup and Markdown to make content more "readable" for LLMs. Google is clear: you don't need to create machine-readable files or special AI text files to appear in generative AI search. While Google can index various file types beyond standard HTML, doing so doesn't grant you any special treatment or priority in AI responses.
Then there is the concept of "chunking." Some practitioners suggest breaking content into tiny, bite-sized pieces to make it easier for AI to digest. Google advises against this. Their systems are sophisticated enough to understand the nuance of multiple topics on a single page and can extract the relevant section for the user without the author having to artificially fragment the content. This aligns with previous comments from Danny Sullivan, who noted that Google's engineers specifically recommend against this practice.
The guide also addresses the way we write. You don't need to rewrite your content specifically for AI systems or obsessively capture every single long-tail keyword variation. Modern AI systems are capable of understanding synonyms and general intent; they don't require the rigid, repetitive phrasing that old-school keyword stuffing once demanded.
Finally, Google touches on "inauthentic mentions." While it's true that AI features often surface discussions from forums, blogs, and videos, the guide warns against trying to manufacture these mentions artificially. Trying to "game" the system by creating fake buzz is not a recommended path to sustainable visibility.
What Google Says To Focus On
If we aren't spending time on llms.txt files or content chunking, where should the effort go? Google suggests returning to the basics, but with a specific lens on what they call "non-commodity content."
This is a crucial distinction. "Commodity content" is the kind of information that is common knowledge and can be found on a hundred different sites—for example, a post titled "7 Tips for First-Time Homebuyers." While useful, it doesn't offer anything unique.
"Non-commodity content," on the other hand, provides unique insight, personal experience, or a perspective that can't be replicated by a generic AI prompt. Google gives the example of a post like "Why We Waived the Inspection & Saved Money: A Look Inside the Sewer Line." The difference is the presence of firsthand evidence and unique storytelling. In an era where AI can generate "7 tips" in seconds, the only way to remain valuable is to provide the kind of insight that only a human with real-world experience can offer.
On the technical side, the advice is straightforward: ensure your pages are indexed and eligible for snippets. If a page can't be a featured snippet, it's unlikely to appear in an AI Overview. This means focusing on:
- Following standard crawling best practices.
- Using semantic HTML to give the page clear structure.
- Maintaining a clean JavaScript SEO implementation.
- Prioritizing a good page experience (speed, usability).
- Reducing duplicate content across the site.
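A quick way to act on the snippet-eligibility point is to audit your own pages for robots directives that would block snippets or indexing. This is a minimal self-check using only the standard library; the directive list it scans for is an assumption based on the commonly documented robots meta values, so treat it as a starting point rather than a complete audit.

```python
# Rough self-audit: scan a page's <meta name="robots"> for directives that
# would make it ineligible for snippets (and therefore, per the guide's
# logic, unlikely to surface in AI Overviews). Illustrative, not exhaustive.
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.directives: list[str] = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.directives += [d.strip().lower()
                                for d in a.get("content", "").split(",")]

def snippet_eligible(html: str) -> bool:
    """False if the robots meta contains noindex, nosnippet, or max-snippet:0."""
    parser = RobotsMetaParser()
    parser.feed(html)
    blocked = {"noindex", "nosnippet", "max-snippet:0"}
    return not blocked & set(parser.directives)

print(snippet_eligible('<meta name="robots" content="nosnippet">'))      # → False
print(snippet_eligible('<meta name="robots" content="index, follow">'))  # → True
```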
For those in the local or e-commerce space, the focus shifts toward structured data and official feeds. Google recommends utilizing Merchant Center feeds and keeping Google Business Profiles updated. They also mention the "Business Agent," a conversational tool that allows customers to interact with brands directly within Search, suggesting that the bridge between "searching" and "interacting" is narrowing.
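For the e-commerce case, structured data usually means schema.org JSON-LD embedded in the page alongside the Merchant Center feed. The sketch below assembles a minimal Product block; the field values are placeholders and the exact required properties should be checked against Google's structured-data reference rather than taken from this example.

```python
# Hedged sketch: a minimal schema.org Product JSON-LD block of the kind
# e-commerce pages pair with Merchant Center feeds. Field choices here are
# illustrative placeholders, not a complete or authoritative schema.
import json

def product_jsonld(name: str, price: str, currency: str, availability: str) -> str:
    data = {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": name,
        "offers": {
            "@type": "Offer",
            "price": price,
            "priceCurrency": currency,
            "availability": f"https://schema.org/{availability}",
        },
    }
    # The result is embedded in the page inside
    # <script type="application/ld+json"> ... </script>
    return json.dumps(data, indent=2)

print(product_jsonld("Garden Hose", "29.99", "USD", "InStock"))
```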
Agentic Experiences Get Initial Guidance
Perhaps the most forward-looking part of the guide is the section on "agentic experiences." This moves beyond simple search results into the realm of AI agents—autonomous systems that can actually perform tasks, such as booking a hotel reservation or comparing complex product specifications across different sites.
Google explains that these browser agents don't just "read" text; they may analyze screenshots, inspect the DOM (Document Object Model), and interpret the accessibility tree to understand how a page works. This is a subtle hint that accessibility isn't just about inclusivity—it's about making your site navigable for the next generation of AI agents.
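To make that concrete, here is a toy "agent view" of a page: a parser that collects only the elements an agent could plausibly act on, going by semantic tags and ARIA roles. The class and its heuristics are invented for illustration, not any real agent's logic, but they show why a `<div onclick=...>` with no role is effectively invisible while a real `<button>` is not.

```python
# Toy illustration of why semantics matter to agents: collect the
# interactive elements a browser agent could act on, using semantic tags
# and ARIA roles. Invented heuristic -- not any actual agent's behavior.
from html.parser import HTMLParser

class AgentView(HTMLParser):
    INTERACTIVE = {"a", "button", "input", "select", "textarea"}

    def __init__(self):
        super().__init__()
        self.actions: list[str] = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag in self.INTERACTIVE or a.get("role") in {"button", "link"}:
            # Prefer an accessible name, fall back to id, then tag name.
            self.actions.append(a.get("aria-label") or a.get("id") or tag)

page = """
<button aria-label="Book room">Book</button>
<div onclick="book()">Book</div>
<a id="contact" href="/contact">Contact</a>
"""
view = AgentView()
view.feed(page)
print(view.actions)  # → ['Book room', 'contact'] -- the onclick div is invisible
```

The clickable `<div>` never shows up in the agent's action list, which is the practical argument for semantic HTML and accessible names.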
The guide points toward the web.dev resources for agent-friendly best practices and introduces the Universal Commerce Protocol (UCP). Co-developed with partners like Shopify, UCP is an emerging protocol designed to allow Search agents to perform more complex actions. While this feels like the "future," it's the first time Google has explicitly linked the technical structure of a website to the ability of an AI agent to execute a transaction.
Why This Matters
This documentation is important because it consolidates a fragmented conversation. For the last year, we've had to piece together Google's stance on AI from podcast interviews, conference slides, and vague blog posts. Having a single, official reference point changes the conversation.
The "mythbusting" section is particularly hard-hitting. By explicitly naming tactics like llms.txt and GEO-specific services as unnecessary, Google is essentially warning site owners not to waste their budgets on "AI optimization" packages that promise secret shortcuts to the top of the AI Overview.
However, a word of caution: this guidance applies specifically to Google. It does not necessarily apply to other AI platforms like Perplexity or ChatGPT. Those systems may use different ranking signals or weight certain markers differently. But since Google still commands the lion's share of search traffic, their guidance should be the primary baseline for most strategies.
Looking Ahead
The most comforting part of the guide is the closing sentiment. Google admits that you don't need to check every single box in the document to succeed. They explicitly state that plenty of content thrives in Search—including within generative AI experiences—without any overt SEO effort at all.
This suggests that the "magic" isn't in the technical optimization, but in the quality of the information itself. The guidance on agentic experiences is framed as something to explore "if you have extra time," implying that while agents are the future, they aren't an urgent crisis for the present.
The takeaway is clear: stop chasing the new acronyms. Focus on creating content that provides genuine, non-commodity value, keep your technical foundation clean, and ignore the noise. The "new" world of AI search is, surprisingly, built on the same foundations as the old one.