After the Google update, I stepped back and reviewed my B2B sites instead of reacting fast. I didn't look at publish dates to explain traffic drops. I looked at intent. Pages that lost visibility usually answered the query halfway or leaned on thin supporting content. That told me exactly where the gaps were. I also avoided pushing new content right away. I focused on improving pages Google already showed positive signals for, adding more depth and clearer answers for users. The one tactic that worked best was consolidating overlapping posts into fewer, stronger topic pages. We merged similar content, added firsthand examples, and tightened internal links so each page owned a single query. On one site, this recovered close to 80% of lost organic traffic within eight weeks, based on Search Console data. My takeaway is simple. Skip shortcuts. Improve what already works and make it more useful for real users.
After the June 2025 Core Update, I had to pivot my strategy for a B2B fintech client whose primary "pillar" guides were beginning to slip in rankings despite their depth. Here is the exact approach I took to stabilize and grow our visibility.

My strategy: the content refresh model

When the June update hit, it became clear that Google was prioritizing real-time utility over historical authority. My most effective tactic was moving from a "publish and forget" mindset to a rolling refresh model. Instead of launching new content, I dedicated 60% of our resources to systematically auditing and updating our top-performing assets every 90 days. This wasn't just about changing the year in the title: we were replacing outdated data points, refreshing expert quotes to reflect the current market, updating our "Experience" signals (E-E-A-T) with new case studies from the previous quarter, and adding new angles and insights that keep the page fresh.

Why did it work? Frequent updates serve two purposes in the current search landscape:

- Crawl frequency: By updating high-value pages, we saw an immediate increase in crawl frequency. Google's freshness systems rewarded us because our content addressed the volatility of fintech regulations and rates better than static competitors did.
- User satisfaction: In B2B, a user who sees a "2023" statistic in a 2025 search result will bounce immediately. By improving data accuracy, we improved dwell time and reduced bounce rates, signals that told the algorithm our site was still the most relevant result.

The result

Within six weeks of implementing this rolling audit, we didn't just recover our pre-update traffic; we saw a 22% increase in organic leads from our core service pages. The transition proved that in 2025, topical authority isn't something you "earn" once; it's something you have to maintain through constant evolution.

Regards, Nikolay Krastev
"Dealing with a major Google algorithm update is always a bit of a "heart-in-your-throat" moment. I remember one specific core update, I think it was the 2023 Helpful Content Update, where we saw a steady, continuous slide in rankings and traffic for a client who had been dominating for years. It's easy to panic and start changing meta tags or disavowing links, but modern updates are rarely about "tricks" anymore; they're about how well you actually satisfy the user's intent.

How I Adapted the Strategy

The first thing I did was stop all new publishing. There's no point pouring water into a leaky bucket. Instead, I shifted the entire team toward a content audit and "relevance" cleanup. We looked at our top 50 pages that lost traffic and asked: "Is this page actually helpful, or is it just 2,000 words of SEO fluff designed to rank for a keyword?" We found that a lot of our content was "me-too" content; it didn't say anything the top three results weren't already saying better. To adapt, we moved from a volume-first approach to a distinctive-value approach. We started interviewing internal subject matter experts to add unique insights, original data, and actual takeaways that an LLM or a basic AI writer couldn't replicate.

If I had to pin it down to one tactic that moved the needle, it was prioritizing Information Gain. In the SEO world, if your article provides the exact same information as every other article on page one, Google has no reason to rank you. To combat the update, we went through our declining posts and added unique elements that didn't exist elsewhere:

- Original charts and data: We pulled internal CRM data to show real-world trends.
- First-person experience: We added "I" and "we" back into the copy, explaining how we tested a product or why a certain strategy failed.
- Contrarian opinions: We challenged the "industry standard" advice where it made sense.
Why it worked

Google's recent updates are designed to filter out the "SEO echo chamber." By adding something new to the conversation, rather than just restructuring the same old keywords, we signaled to the algorithm that our page deserved to exist independently of the competition. Within about two months, we didn't just recover; we actually surpassed our previous peak traffic because our content was finally "authoritative" in the true sense of the word." - Sean Chaudhary (has contributed 50+ blogs to GoDaddy over the years)
When a major algorithm update hits, my first move is not to "fix" anything immediately; it's to understand what Google is rewarding now, not what it's penalising. In recent updates (particularly those tied to helpful content and AI-driven interpretation), I shifted strategy away from reactive tweaks and towards improving how pages are understood, not just how they rank.

How I adapted the strategy

Segmented impact analysis. I broke performance down by:
- Intent type (informational vs commercial)
- Page type (category, PDP, guides)
- Query behaviour (pre- and post-purchase)
This showed that visibility losses weren't universal; they were intent-specific. That insight stopped me from making blanket changes.

Re-aligned content with search interpretation. Instead of adding more content, I focused on:
- A clear primary intent per page
- Stronger topic framing in the first 20-30% of the content
- Removing competing intents from the same URL
This was especially important as Google increasingly relies on partial page extraction and AI summaries.

Strengthened trust and clarity signals. I tightened:
- Internal linking around topic clusters
- Entity consistency (brand, products, expertise)
- Supporting evidence (FAQs, specs, comparisons, real use cases)
The goal was to make pages easier to interpret and trust, not just longer.

The single most effective tactic: intent simplification at page level. The biggest wins came from auditing pages and asking one question: "If Google or an AI model only read part of this page, would the intent still be obvious?" By simplifying pages to serve one dominant intent, removing dilution, and reinforcing that intent early and structurally, I saw faster recovery post-update, stronger ranking stability, and improved performance in AI-driven SERP features.

In short, the transition wasn't about chasing the algorithm; it was about making content unambiguous, extractable, and useful, which is exactly what modern search systems are trying to reward.
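The segmented impact analysis described above can be sketched in a few lines of Python. This is an illustrative sketch only, with invented query data and a deliberately crude intent classifier, not the consultant's actual tooling:

```python
from collections import defaultdict

# Crude intent classifier: a handful of commercial markers; everything
# else is treated as informational. A real audit would use a richer
# taxonomy (navigational, pre-/post-purchase, etc.).
COMMERCIAL_MARKERS = {"buy", "price", "pricing", "best", "vs", "review", "cheap"}

def classify_intent(query: str) -> str:
    words = set(query.lower().split())
    return "commercial" if words & COMMERCIAL_MARKERS else "informational"

def clicks_delta_by_intent(rows):
    """Sum pre- and post-update clicks per intent bucket and return the delta."""
    totals = defaultdict(lambda: [0, 0])
    for row in rows:
        bucket = totals[classify_intent(row["query"])]
        bucket[0] += row["clicks_pre"]
        bucket[1] += row["clicks_post"]
    return {intent: post - pre for intent, (pre, post) in totals.items()}

# Invented Search Console-style export rows.
rows = [
    {"query": "what is a heat pump", "clicks_pre": 140, "clicks_post": 150},
    {"query": "best heat pump brands", "clicks_pre": 220, "clicks_post": 90},
    {"query": "heat pump price sydney", "clicks_pre": 180, "clicks_post": 70},
]

print(clicks_delta_by_intent(rows))
```

A large negative delta concentrated in one bucket (here, commercial) is the signal that losses are intent-specific rather than universal, which is what stops you from making blanket changes.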
After Google's Helpful Content update, one of our Web3 clients—a DeFi analytics platform—saw traffic dip across pages that previously ranked well. The content was technically solid, but we realized it was overwhelming users with too much information in one go: long-form explainers trying to cover the entire DeFi ecosystem in a single post. We restructured the strategy around user intent clarity. Instead of one giant article like "How to Use DeFi Tools," we split it into focused pages: tracking wallets, analyzing gas fees, monitoring yield, and comparing protocols. Each piece addressed a single pain point and linked back to a main hub. Rankings returned, and we also started capturing featured snippets and breakout traffic from "People Also Ask" results. The most effective move? Breaking up content by function, not format—especially in a space as complex as Web3.
International SEO Consultant, Owner at Chilli Fruit Web Consulting
When rankings took a hit, I decided to change how we structured content. I started treating the first screen like a pitch deck: I compressed everything into three to five scannable blocks: a concise TL;DR summary, FAQs, and clear pros and cons at the top. That one change got us into AI Overviews and zero-click spots, which kept the brand visible even as traffic dropped. For one e-commerce client, traditional CTR fell, but branded impressions in AI surfaces went up 40% over two months, simply because we designed around extraction. If your first 100 words work out of context, you're on the right track.
After the March core update, we noticed a big change in how Google treated singular keywords compared to plural versions of the same keywords in the SERPs. Prior to the update, Google treated singular and plural versions pretty much the same. One of our clients has a two-word name ending in a plural, and the company name is a primary keyword. Where we used to rank 1, 2, or 3 for both versions of this keyword, after the update we still ranked in slots 1-3 for the plural, but for the singular version we ranked much lower on the page, in slots 6-12. The frustrating aspect of this change is that most people search for the singular version of the primary keywords. How did we adapt? We started using singular versions of the primary keywords at a higher rate in any new content we published, and we revised the current product descriptions to reflect the singular versions. The result: the client had their best sales month of all time in November 2025, after 10 years in business.
When the last major Google update rolled out, I shifted focus away from publishing more pages and doubled down on being cited as a subject matter expert on authoritative sites. Instead of chasing links, I prioritised expert contributions where my name and experience were quoted in context on industry publications and business media. The tactic that mattered most was proactively responding to journalist callouts with specific operational insights drawn from real work rather than generic SEO advice. Those mentions consistently drove referral traffic and coincided with ranking stability. The biggest lesson was that authority signals built through real expertise held up far better than on-site tweaks during the transition.
After major updates, there are always winners and losers. Sometimes that's due to your own changes, and sometimes it's because competitors shifted in ways that reshuffled the results. After 10 years in the industry, I've learned not to chase every update and instead to focus on what updates consistently reward: better user experience and clearer, more helpful content, even when there's short-term volatility. The single most effective tactic during these transitions is a disciplined content audit that identifies pages that are stale, thin, or no longer serving users. Once you find opportunities for improvement, you can update, consolidate, deprecate, or repurpose those pages so your site's quality signals improve and your strongest pages earn the attention they deserve. The goal is to make sure every page on your website is pulling its own weight, because a rising tide lifts all boats!
Since the December 2025 Core Update, a pattern emerged across a handful of client sites: mid-funnel "best of" category pages and comparison content were less stable in the SERPs, while brands and sites with genuine product authority gained visibility. The single most successful approach was upgrading generic "best of" and comparison pages into specialist, product-authoritative landing pages. This involved improving thin list pages by promoting the best content onto a corresponding category or service landing page, as well as providing proof that a real company stands behind the recommendations: clear selection criteria, direct product information, contextual information about availability, clear links to the specific next step, and short FAQs that answer what buyers ask right before making a purchase. The strategy succeeded because these updates favor sites that are most authoritative on a particular subject, as opposed to the loudest and most general.
After the March 2024 Core Update, we realized that longer articles covering multiple search intents lost rankings on our own website and on customer websites. We figured they were no longer as helpful to the reader. So we made a bold move: we created micro-content, 300-500 words built around a single search intent, with a series of smaller follow-up questions and answers based on the main title, all built into one micro-blogpost in Q&A format. Central to these posts were real user questions, sourced from PeopleAlsoAsked, in the page heading and SEO title, with an immediate, helpful answer in the introduction. This micro-content proved very successful after the update, and also throughout the introduction of AI Overviews and AI Mode. With the help of AI developers, our own software, and the right user prompts, we were quickly able to scale this across our customer portfolio (to date we have roughly 1,600 customers), and average organic traffic growth over the last year has been 60% at portfolio level, across 70 different industries. We combine this approach with a topical cluster strategy that guides which real user questions we need to answer via our customers' micro blogs.
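The micro-content format described above (a real user question as the title, an immediate answer, then short follow-up Q&As) lends itself to templating, which is what makes it scalable across many sites. A minimal sketch; the helper name and all sample content are invented for illustration:

```python
def build_micro_post(main_question, main_answer, followups):
    """Render a single-intent Q&A micro-post: the user's question as the
    title, the direct answer immediately below it, then short follow-up
    Q&A sections (the PeopleAlsoAsked-style questions)."""
    lines = [f"# {main_question}", "", main_answer, ""]
    for question, answer in followups:
        lines += [f"## {question}", "", answer, ""]
    return "\n".join(lines).rstrip() + "\n"

post = build_micro_post(
    "How long does a roof restoration take?",
    "Most residential roof restorations take two to four days, "
    "depending on roof size and weather.",
    [
        ("Does rain delay a restoration?",
         "Yes. Coatings need dry conditions, so rain pauses the work."),
        ("Do I need to leave the house?",
         "No. All of the work happens outside."),
    ],
)
print(post)
```

The point of the template is discipline, not automation for its own sake: every post stays on one intent, answers immediately, and keeps the follow-ups tied to the main question.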
When Google rolled out the June, August and December updates, we didn't rush to change anything. We've seen enough updates over the years to know that early movement doesn't always mean something is wrong. While each update was happening, we mostly watched. Rankings moved around, some pages went up, others dipped, and then things started to settle. Once it was clear the update had finished, we went back and reviewed what actually changed. We looked at pages one by one and asked some honest questions. If something dropped, was it still useful? Was it clear? Would someone reading it feel confident calling the business, or would they still have questions? We kept publishing new pages during this time, but we slowed down and were more selective. If a page didn't add something useful, we didn't publish it. There was no point adding more noise. We were also careful not to make big changes without a clear reason. We didn't rewrite whole websites or remove pages just because rankings shifted. If we couldn't point to a real issue, we left things alone. A good example of this was a roofing company we work with in Perth. After the August update, a few of their main service pages slipped slightly. Pages around roof repairs and restorations weren't showing as often as they had before. Instead of starting from scratch, we read those pages as if we were homeowners. The information was fine, but it felt a bit generic. It didn't reflect what roofing work in Perth actually involves. So we made small changes:
- We added photos from jobs they'd completed around Perth
- We explained common problems they see on local roofs, like heat damage and coastal wear
- We answered the same questions customers usually ask before booking
- We removed wording that sounded vague or salesy
Nothing complicated. We didn't touch the layout or structure. After the December update finished, those pages came back and then improved.
The calls that came through were also better — people knew what they were calling about and were ready to book work. If there's one thing that helped more than anything else, it was this: We wrote pages the same way we'd explain the job to someone in person. We weren't trying to please Google. We were trying to explain the work clearly, using real examples, so people could decide if the business was right for them. Once we did that, the rest took care of itself.
I treated the update like a forced audit. I cut or noindexed anything that looked like it was "for Google" instead of for people: thin posts, duplicated guides, city-page clones, old linkbait. That alone made the site smaller but stronger. Then I stopped thinking in terms of single keywords and pages, and moved to topics. For each core problem the business solved, I built one main hub page and mapped every related query and subtopic to it. Old scattered posts were merged into that hub or turned into clear supporting pieces that linked back in. That made it easier for Google and users to see who should "own" a topic. The tactic that helped most was going to sales and support to mine real questions. I pulled call notes, chat logs, proposal docs and emails where people hesitated, pushed back on price, or compared us to rivals. I used those exact words to rebuild key pages. On those pages I: - Led with a clear, direct answer in plain language. - Added chunks we'd missed: pricing logic, who it's not for, comparisons, timeframes, risks. - Removed filler written just to wedge in extra keywords. Traffic didn't snap back in a week, but the pages that aligned with those real questions stabilised first and then grew. The big win was that future updates hurt less, because the content was tied to customer intent, not to a specific version of the algorithm.
When a major Google algorithm update hits, the most effective strategy is often the most difficult one to execute: disciplined inaction. The natural instinct for many SEOs is to panic and start making drastic changes to the site as soon as rankings fluctuate, but this is a mistake. During the rollout and for a short period immediately following it, the best course of action is to do nothing at all. Rapid, radical shifts in strategy can look like a direct attempt to manipulate the results, which can raise red flags for the algorithm and potentially trap a site in a downward trend that becomes much harder to escape. Instead of making structural changes, this transition period should be used exclusively for deep data collection and observation. You need to wait for the "dust to settle" so you can objectively analyze how the top 10 has shifted. By comparing your site against the new winners in the SERPs, you can identify which specific factors Google is now prioritizing, whether that's factual density, site speed, or the depth of expert citations, without making premature adjustments that might inadvertently damage your long-term authority. The single most effective tactic during these transitions is maintaining your baseline of high-quality work while performing a gap analysis on the updated competition. You look at who is growing compared to your own positions and adjust your strategy only after the update is fully confirmed. Staying the course prevents you from chasing temporary "ghost" fluctuations and ensures that when you finally do make a move, it is based on the new reality of the search landscape rather than a knee-jerk reaction to a volatile update period.
When the update hit, I stopped worrying about "SEO tricks" and focused on trust. I added real author info, shared actual examples of our work, used stronger sources, and showed proof of results. I realized that Google just wanted to see that real people with real experience were behind the content, and once I improved the site's credibility, rankings leveled out again.
Founder, Editor & Ops for SEO, Content Marketing, Digital Strategy, Social Media Marketing, Content Strategy, and Search Marketing at SEOSiri
In response to AI changes in search, we shifted our SEO toward Answer Engine Optimization. The single most effective tactic was creating citation-worthy assets like templates and checklists with structured schema so we became a direct source for AI answers. We measured progress by tracking brand mentions in AI Overviews, which drove gains in branded searches and direct traffic.
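The structured schema mentioned above is typically emitted as JSON-LD embedded in the page. A hedged sketch using only Python's standard library; the checklist question and answer are invented placeholders, not SEOSiri's actual markup:

```python
import json

# Invented FAQPage structured data (schema.org) for a checklist-style
# asset; this is the kind of markup that makes an answer easy for
# search and answer engines to quote directly.
faq_schema = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": "What belongs in a site-migration checklist?",
            "acceptedAnswer": {
                "@type": "Answer",
                "text": "Crawl the old site, map 301 redirects, verify "
                        "canonical tags, and monitor Search Console after launch.",
            },
        }
    ],
}

# Embed as a JSON-LD block in the page's <head>.
json_ld = (
    '<script type="application/ld+json">\n'
    + json.dumps(faq_schema, indent=2)
    + "\n</script>"
)
print(json_ld)
```

Template and checklist pages pair well with this markup because each checklist item maps naturally onto a Question/Answer entity that an answer engine can cite on its own.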
So after the Gemini 3 update dropped, it was obvious to me that old-school SEO just wasn't going to cut it anymore. AI Mode and voice search started skipping websites entirely. My team and I had to pivot hard; we reworked everything. We started building tight little content clusters made just for AI: no fluff, clear sections, short answers. And we didn't just optimize for keywords; we focused on how AI actually reads content and spits it back out. But honestly, one of the biggest wins came from how we handled Google Business Profiles. With AI Overviews now leaning toward showing GBP links instead of websites, we had to adapt fast. So we ramped up the Q&A, posted consistently, and got very intentional about reviews. Sure enough, our home-services client started seeing more action from their GBP than from their actual website. That didn't just "happen"; it's because we treated the profile like a lead channel, not an afterthought. So right now, I think the most effective tactic is focusing hard on GBP. - Ivan Vislavskiy, CEO and Co-founder of Comrade Digital Marketing Agency
Technical Product Manager and Director of Digital Marketing at Patio Productions
When we released a large number of products, it became hard to make all of those pages perform during some of the most aggressive and damaging site-wide Google updates in recent years, which focused on removing thin content. My team found that approximately 40% of the pages Google had crawled and indexed generated no organic traffic and significantly diluted our domain authority. Instead of attempting to improve the ranking of every single product variant individually, we removed the low-value product variants from Google's index and merged them into robust parent category pages. Our goal was to be as diligent as possible about managing what we fed to search engines, because search engines will always favor the sites that get users to the solutions they need faster. The most successful tactic we employed was using rel=canonical tags and 301 redirects on over 2,000 duplicate product URL variations, which made better use of our crawl budget and produced a 15.40% increase in organic revenue over a three-month time frame. Most marketers spend all their time trying to create more content, but the greatest value lies in eliminating friction between the buyer and the product they want to purchase. We took a short-term hit to our total indexed page count for the greater good of establishing credibility and trust with both the algorithm and the customers who purchase our furniture.
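Consolidation like this is mostly a mapping exercise: each duplicate variant URL needs a 301 redirect to its parent, and the consolidated parent page carries the canonical tag. A minimal sketch with invented URLs (nginx-style rewrite lines shown for illustration; the actual mechanism depends on the stack, and this is not the team's real tooling):

```python
# Invented variant -> parent mapping; in practice this would come from
# a crawl export or the product database.
variant_to_parent = {
    "/products/widget-blue-small": "/products/widgets",
    "/products/widget-blue-large": "/products/widgets",
    "/products/widget-red-small": "/products/widgets",
}

def redirect_rules(mapping):
    """One permanent (301) nginx-style rewrite per duplicate variant URL."""
    return [
        f"rewrite ^{variant}$ {parent} permanent;"
        for variant, parent in sorted(mapping.items())
    ]

def canonical_tag(path, domain="https://example.com"):
    """Self-referencing canonical tag for the consolidated parent page."""
    return f'<link rel="canonical" href="{domain}{path}">'

for rule in redirect_rules(variant_to_parent):
    print(rule)
print(canonical_tag("/products/widgets"))
```

Generating the rules from a single mapping keeps redirects and canonicals consistent, which matters at 2,000+ URLs, where hand-edited rules drift out of sync.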
When Google's helpful content update dropped in late 2023, we honestly panicked for about a week. Traffic dropped 40% and we couldn't figure out why since we weren't doing anything sketchy. What saved us wasn't some genius plan. We basically got lazy with our content. Instead of spending hours on perfect blog posts, our designers and devs just started writing about whatever bugged them that week. Stuff like "Why do clients always want carousels when nobody actually clicks them" or "I just wasted 6 hours on this idiotic CSS bug." Those rough, complaining posts with screenshots from real projects started ranking way better than our nice clean articles ever did. One of our developers wrote about a Shopify migration disaster at 11pm after a really bad day. Barely edited it. That post still gets us 3 solid leads every month. Turns out Google can now spot the difference between content you write to game rankings and content you write because you genuinely have something to vent about. We kind of backed into authenticity because we were too exhausted to keep up the professional act.
Having navigated multiple algorithm updates over the past decade, I've learned that adaptability and foresight are crucial in maintaining a resilient SEO strategy. When Google rolled out the core algorithm update in 2020, our approach at CheapForexVPS shifted significantly to prioritize user intent over keyword density. By analyzing search trends and leveraging tools like SEMrush and keyword gap analysis, we identified that long-tail keywords aligned with our audience's pain points yielded a substantial increase in organic engagement. For example, instead of targeting broad terms like "Forex VPS," we optimized for "best Forex VPS for low latency" and saw a 37% growth in qualified traffic within three months. Beyond keywords, we doubled down on content relevance by refreshing older posts and integrating structured data for better SERP visibility, which improved click-through rates by 22%. Having spent over 12 years directing Sales, Marketing, and Business Development, I've observed the long game in SEO is about syncing business goals with search intent. Tracking behavioral analytics, staying agile, and being proactive with updates like Core Web Vitals meant that even during volatile algorithm changes, we saw consistent ROI growth. The key takeaway? Prioritize solving real user problems with targeted, data-informed content—search engines reward that.