The industry wants us to believe everything is about AEO and GEO now, that the "old" SEO game is dead and we need to completely readapt. But the truth is that anyone doing serious SEO in the last two years was already optimizing for the same things we need to optimize for today. Sure, some things have changed, but serious SEOs already had these in their scope: measuring zero-click traffic, earning actual media coverage (PR-type coverage, not just backlink hunting), adding quotable stats, and using question-based headings with immediate answers. Finally, they were already developing social media and owning Google real estate by repurposing content across platforms: social posts, infographics, and Reddit.
1. Create and install an llms.txt file. This acts much like a sitemap.xml file and puts the site content in a format that LLMs can understand. This can be done with plugins or with an llms.txt generator tool. 2. Create a series of FAQs on an FAQ page and implement FAQ schema markup on that page. 3. When creating new content, try typing the title into Google search to see whether any AI results are displayed. If so, copy that content, rewrite it until it is at least 30% different, and use it in the first paragraph. 4. When creating new content, make sure to answer the major questions with well-researched, authoritative content. We implemented these steps four months ago, and most of our clients are seeing 1%-5% increases in referral traffic; more importantly, time on site from these referrals is often several minutes!
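Step 2 above, FAQ schema, is the most mechanical of the four. A minimal sketch of the schema.org FAQPage markup it refers to, here generated with Python; the questions and answers are hypothetical placeholders, not content from any real site:

```python
import json

# Hypothetical FAQ entries; replace with your site's real questions and answers.
faqs = [
    ("What is an llms.txt file?",
     "A plain-text index, similar to sitemap.xml, that lists site content "
     "in a format LLM crawlers can parse."),
    ("Does FAQ schema help AI visibility?",
     "Structured question-and-answer markup makes answers easier for "
     "engines to extract and cite."),
]

# Build schema.org FAQPage markup (see https://schema.org/FAQPage).
faq_schema = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": question,
            "acceptedAnswer": {"@type": "Answer", "text": answer},
        }
        for question, answer in faqs
    ],
}

# Embed the output in the FAQ page inside <script type="application/ld+json">.
print(json.dumps(faq_schema, indent=2))
```

The same structure can be produced by most SEO plugins; the point is that each question/answer pair becomes one `Question` entity with an `acceptedAnswer`.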
AI search is having a bit of a magpie moment. Everyone's chasing the shiny stuff. However, I think the smart thing to do is to stick with SEO first principles. Here's how I'm adapting my strategy for AEO: Doubling down on pages already in the top 10. They're far more likely to appear in AI overviews and LLM citations. My focus is to tighten what's already winning. Adding relevant structured data. Based on research and client work, I've found that schema markup reliably improves search visibility. This has become a non-negotiable in my SEO work. Writing for skimmers. Machine learning systems tend to favour content that's easy to parse. So, I keep paragraphs short, use well-optimised headers, and lean on numbered lists.
It was a revelation to me that websites with almost no traffic from traditional search often get significantly more traffic from AI search (ChatGPT, Perplexity, and so on) than sites with high traffic from Google and Bing. For AEO, what matters much more is how your site is mentioned across different sources. And the authority of those sources is less important than it typically is in traditional link building.
In 2025 I've shifted my focus from broad informational "what is..." articles to content that sounds like a real person talking. We focus on listicles, comparison pieces, and opinion posts based on actual experience. These bring in much more traffic from ChatGPT, Perplexity, and similar tools than generic guides ever did. Even if your informational article gets mentioned in AI tools, it won't matter much, because the user already gets everything they need without clicking through to your website; with listicles and comparison pieces, they may want to dig deeper. For off-site SEO, I treat brand mentions as a key tactic. Whenever we're mentioned on other sites, I push for the same phrase to be used each time, for example "best luxury hotel marketing agencies," so answer engines learn to tie that wording to our brand. This combination has consistently delivered results for us.
AI is just a large dataset predicting the next word. In my experience, depending entirely on ChatGPT for creative tasks isn't ideal. ChatGPT can suggest a list of keywords, but it's up to us to select the ones that fit our niche and the article's intent, and to research how difficult it would be to rank for them. The same goes for writing content with GPT: it can produce large blocks of text that no one will read, so we give it a well-written prompt, take the result as a starting draft, then polish it and verify the accuracy of the data it provides. Per Google's guidelines, content is ranked on whether it follows Google's rules and provides value to readers, whether it's written by AI or a human, rather than on attempts to manipulate ranking systems. ChatGPT can also write good, workable meta titles and descriptions, but again, always proofread, ask for variations, and choose the most suitable one yourself. The same goes for page slugs, canonical tags, sitemaps, and snippets.
We use Azoma to simulate real customer questions and send hundreds of thousands of prompt variations to platforms like ChatGPT and Perplexity to understand what people are actually asking. Based on these insights, we create optimized content (AI engines particularly like FAQs, tables, and content that addresses frequent customer queries). Secondly, from analysing those prompts, we identify where the AI engines draw their citations from; usually Reddit, Wikipedia, YouTube, and niche industry-specific publications. Once we know those sources, we focus on getting our brand featured in them through authentic content and traditional PR. Finally, instead of tracking clicks, we measure share of voice: how often our brand appears in AI responses. This has driven significant revenue impact, as we have obtained thousands of good leads through our strong ChatGPT visibility, and our data shows that traffic from AI engines converts at 5x the rate of traditional search engines like Google.
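The share-of-voice metric described above reduces to a simple count: of all the AI responses you collect, what fraction mentions each brand? A minimal sketch, with hypothetical brand names and responses (Azoma's actual pipeline is not public, so this only illustrates the metric itself):

```python
from collections import Counter

def share_of_voice(responses, brands):
    """Fraction of AI responses that mention each brand (case-insensitive substring match)."""
    counts = Counter()
    for text in responses:
        lowered = text.lower()
        for brand in brands:
            if brand.lower() in lowered:
                counts[brand] += 1
    total = len(responses) or 1  # avoid division by zero on an empty batch
    return {brand: counts[brand] / total for brand in brands}

# Hypothetical responses collected from a batch of prompt variations.
responses = [
    "For CRM tools, many teams pick Acme or Globex.",
    "Acme is a common recommendation for small businesses.",
    "Globex and Initech both offer free tiers.",
]
print(share_of_voice(responses, ["Acme", "Globex", "Initech"]))
# -> {'Acme': 0.666..., 'Globex': 0.666..., 'Initech': 0.333...}
```

In practice you would want fuzzier matching (brand aliases, plural forms) and to track the metric per prompt theme over time, but the core of the KPI is this ratio rather than click counts.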
The entire SEO industry is predicted to undergo a radical change in 2026, as AI-based applications like ChatGPT and Perplexity are adopted by large numbers of people in their daily work. Tribe Digital will fully embrace the semantic SEO approach, producing content that not only satisfies the user's need but also presents it in an entertaining, well-structured manner. Our markup, FAQs, and expert opinion snippets make our content so clear and accessible that even AI crawlers can understand it easily. The most rewarding experience for us has been helping our clients secure positions of authority, not just act as content creators. What could be the downside? It is difficult to gauge visibility when clicks are not the main KPI. Hence, we count mentions, citations, and the company's presence in AI-generated summaries to make sure our visibility converts into trust and engagement.
As the world of search rapidly changes, we aim to change with it. Rather than focusing primarily on keyword optimization and word count, we have shifted our efforts to making sure that each page and article is satisfying to the user, answers the questions a user might ask about the topic, and is organized in a way that search engines can easily understand. Simply adding more FAQs to an article has contributed to many clients winning sales as a result of showing up in AI search results. We are also more focused on building topical and brand authority for clients.
Answer engines reward clear answers, not long lists of links. In 2025 I focus on answerability rather than position. What works for HoverBot is building entity-focused pages with a short summary, a plain definition, and step-by-step instructions. We add structured data for frequently asked questions, how-to guides, question-and-answer pages, and product attributes, and we use section anchors so assistants can cite a single paragraph. We publish small data tables and benchmarks with sources. We keep a living changelog and real author biographies to demonstrate experience, expertise, authoritativeness, and trustworthiness. To choose topics, we run customer interviews, review support tickets, and study Google Search Console queries, then group phrasing into tight themes and write one focused page per theme. What does not work is chasing broad keywords, thin list posts, or link swaps.
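The section-anchor tactic mentioned above depends on every heading having a stable fragment id, so an assistant can link to `page#that-section` rather than the whole page. A minimal sketch of deriving such ids; the slug rules here are a common convention, not a standard:

```python
import re

def heading_anchor(heading):
    """Derive a stable fragment id from a heading text, suitable for
    <h2 id="..."> so assistants can cite a single section."""
    slug = heading.lower()
    slug = re.sub(r"[^a-z0-9\s-]", "", slug)  # drop punctuation
    slug = re.sub(r"\s+", "-", slug.strip())  # collapse whitespace to hyphens
    return slug

print(heading_anchor("How do answer engines cite pages?"))
# -> how-do-answer-engines-cite-pages
```

Most static-site generators and CMSs do this automatically; the point is to keep the ids stable across edits so existing citations don't break.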
Our content architecture now focuses on delivering organized answers, with each page built from specific sections that can be summarized independently. The current priorities are definition blocks, bulleted takeaways, and schema markup. Our content teams now rely on technical SMEs for brief, trustworthy explanations instead of keyword-driven copy. Putting credible, well-formatted information on client pages gets them surfaced in Perplexity and Bing Copilot results. By contrast, AI-first "prompt-style" phrasing has proven unsuccessful: it fails because LLMs transform input during processing rather than echoing it back. Our source data needs to be accurate, because LLMs extract information from patterns, and errors in the input produce poor output.
Working with surgeons showed me we needed to change our content. We ditched the formal FAQs and just answered questions like a real person would. That worked. Our clients started showing up more in Perplexity and other AI search. The tricky part is keeping your Google Business Profile consistent with those AI summaries. I recommend checking how AI tools cite your brand regularly, so you can spot the mismatches before they hurt your visibility.
My local SEO experience taught me that getting pulled into AI like ChatGPT and Perplexity is about how you answer, not just the keywords you use. Writing responses like you're talking to a neighbor, direct and helpful, works best. You might not land on page one of Google, but the increase in new leads from AI is real. I'd test different FAQ layouts and see which ones the AI actually cites.
So we've been testing how to get our content picked up by AI tools like ChatGPT. At CLDY.com, what actually worked was rewriting our FAQs into short, one-paragraph answers. We saw a real jump in how often we showed up. The old stuff, like keyword stuffing and long articles, did nothing. It felt strange to be so brief at first, but now it's how we build every new page. Just answer the user's question simply. That's what the AI wants.
Running ShipTheDeal, we noticed when people ask ChatGPT for deals, it's not about keywords anymore. We had to rethink things. We cleaned up our deal data and made our comparison charts easy for AI to read. Now our content gets picked up. My advice? Stop chasing keywords. Focus on making your content clear, well-structured, and full of your own honest take.
I run Tutorbase, and we found the old SEO tricks don't work as well now, not with stuff like ChatGPT. We started making our EdTech case studies super specific, listing exact time savings and user data. All of a sudden, our content started getting pulled directly into AI answers. My advice? Stop fretting about generic keywords. Show examples with real numbers. That's how people find stuff that actually works.
We've started releasing structured, data-rich content. These pages contain clear, factual information with our expert input, which is the type of content that artificial intelligence can trust. We've discovered during this process that each LLM relies on different ecosystems: Perplexity mostly cites Reddit and YouTube, Google relies on Quora, Reddit and LinkedIn, and ChatGPT typically draws from Wikipedia. We monitor each platform's referral sessions and analyse which ones best suit our target audience. AI-driven reach grows most quickly when you choose which LLM your target market interacts with the most and increase your visibility on the content sources that those models rely on.
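Monitoring each platform's referral sessions, as described above, starts with classifying referrer URLs by AI engine. A minimal sketch; the domain list is an illustrative assumption and would need to be kept current as platforms change their referrer domains:

```python
from urllib.parse import urlparse

# Hypothetical mapping of referrer domains to AI platforms; verify and
# extend this against your own analytics data.
AI_REFERRERS = {
    "chatgpt.com": "ChatGPT",
    "chat.openai.com": "ChatGPT",
    "perplexity.ai": "Perplexity",
    "gemini.google.com": "Gemini",
}

def classify_referrer(referrer_url):
    """Label a session's referrer as an AI engine, or None if unmatched."""
    host = urlparse(referrer_url).netloc.lower()
    # Match the domain itself or any subdomain of it.
    for domain, platform in AI_REFERRERS.items():
        if host == domain or host.endswith("." + domain):
            return platform
    return None

print(classify_referrer("https://www.perplexity.ai/search?q=best+crm"))
# -> Perplexity
```

Aggregating these labels per session is enough to see which LLM your audience actually arrives from, which is the signal the paragraph above uses to decide where to invest.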
Answer engines don't use keyword matching the way most people assume. They interpret what something means and compare it against a body of data. I'm adapting by writing in a way that matches how people ask full questions, rather than just targeting search terms. Using chat logs to find real user questions and turning them into headings and bullet points works best. I try to answer in the same conversational way someone would respond naturally. Stuffing in large numbers of keywords, or writing lengthy pieces solely to increase word count, is ineffective; these models extract the relevant information quickly.
I'm shifting our SEO to focus on content that AI can't easily replicate. We're moving away from chasing top-of-funnel "what is, how to..." keywords on the site, because AI and zero-click results are eating that traffic, and we're covering those topics on YouTube instead. On the website, we're doubling down on commercial and transactional intent, because those still lead to real leads. We're also layering in more PR and brand mentions, as answer engines seem to reward sites that are mentioned elsewhere. The current formula for us is: let the video handle awareness, let the site handle buying.
Our Generative Engine Optimization strategy centers on two major shifts: content focus and off-platform digital PR. Regarding content, we've found that large language models (LLMs) favor industry roundups and vertical comparisons over standard transactional queries. We now prioritize creating these assets to tactfully position our clients as the best use case. We've also dramatically reduced top-of-funnel articles and instead produce more research-based content, such as meta-analyses and original industry trend reports. This information gain is highly valued by LLMs. This strategic content shift is generating stronger inbound opportunities from ChatGPT and Perplexity specifically. Outside of our site, we've expanded our thought leadership to platforms like Reddit, LinkedIn, and Medium to be part of the conversation where LLMs source information. Lastly, we've started pursuing inclusion in industry roundups and competitor comparisons, as these mentions are proving to be extremely valuable for authority and discoverability within answer engines.