In August, Google introduced one of its most notable updates of the year, which many SEO specialists saw as a partial rollback of the divisive Helpful Content Update—a potential acknowledgment of its previous impact.
While certain sites experienced improvements, others saw a significant drop in traffic.
This case study explores how the update affected a client’s site, the key metrics that were impacted, and the strategies we are implementing to regain visibility and enhance SEO performance.
Diagnosing the drop
During a recent SEO Office Hours podcast, a participant asked:
– “We weren’t affected by the August update, but an indexing bug around the same time decreased our traffic. How can we resolve this?”
What struck me first was the confidence behind that assertion. How sure are you that it was solely the indexing error if you’re only looking at traffic?
The error was rectified in a few days, so why wasn’t there a recovery in traffic and rankings?
We can set aside the drops during the two days the bug was live, but if they persist afterward, could the core update be involved?
Identifying the true cause is not always straightforward. We encounter this frequently.
Some clients don’t know where to begin. Others simply lack the time to delve deeply. This was true for our new client.
Their competent SEO team needed assistance auditing the website and a partner to help address any issues.
Once a Google core update rollout is complete, multiple metrics should be examined.
As with any SEO matter, these depend on various factors such as your key performance indicators (KPIs), the type of website you manage, and the countries you target.
To evaluate the impact of the Google core update for our client, we employed a three-step approach:
– Comprehend what occurred sitewide.
– Use the sitewide indicators to further segment the data for analysis.
– Conduct an in-depth review of the most affected pages and website sections.
Understand what happened sitewide
Our client had a large site, and going through each page was prohibitively time-consuming. We needed a comprehensive understanding to segment and prioritize effectively.
To achieve this, we analyzed:
– Traffic and conversion trends.
– Overall content health.
– Link profile.
– Technical SEO.
– A wild card: persona-based sitewide signals.
Traffic and conversion trends
Traffic and conversions were two main KPIs for our client, so our analysis started here.
We explored their GA4 and GSC data to assess organic trends and traffic patterns.
This broad view establishes a baseline for our analysis. For instance, if traffic across all channels declines substantially, the drop may not stem solely from organic search issues.
We noticed a slight decrease in overall sessions around the August core update, while conversions were unaffected, suggesting organic factors likely played a role in the change.
This hypothesis was confirmed through further exploration of organic data in GA4 and GSC.
Traffic had significantly decreased according to both GSC and GA4.
Interestingly, similar to overall traffic, organic conversions remained relatively stable.
It was clear that the issue wasn’t with the website’s relevance to its audience; those who visited were converting. However, the overall number of users accessing the site had decreased noticeably.
Before examining the rankings, we segmented the data by country to pinpoint where the impact was most severe.
This led us to a crucial question: should we focus on the hardest-hit market?
In this case, India was significantly affected. Although it would have been tempting to start auditing rankings in that market, the details mattered.
iOS doesn’t dominate the operating system market in India. In 2023, Android had a 95.17% market share, while iOS held only 3.98%.
As our client’s solution is iOS-specific, India isn’t the right market to prioritize. Instead, focusing on the U.S., their most relevant and largest market, is crucial.
Overall content health
Next, we examined the overall content health, providing early insights into potential issues with content at scale. We utilized Screaming Frog for this content analysis.
To effectively use any tool, it’s crucial to identify what truly matters within the bulk of data.
In this case, it would have been easy to fixate on a less relevant metric like readability: the scores looked poor, but for this technical audience, denser content aligns with expectations.
The more pressing issue wasn’t readability but the significant number of near duplicates. Most affected pages were in the template directory—a crucial insight for segmenting data by website structure.
Note: This analysis differs from evaluating Google’s definition of helpful content. There are methods to assess content helpfulness at scale, primarily using Google’s NLP API to identify entities and analyze sentiment, which we plan to integrate soon.
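Conceptually, near-duplicate detection of the kind crawlers perform can be sketched with word-shingle Jaccard similarity. This is a minimal illustration of the idea, not Screaming Frog's internals; the sample texts and the 0.6 threshold are illustrative assumptions.

```python
# Minimal sketch of near-duplicate scoring via word-shingle Jaccard
# similarity. Sample texts and the 0.6 threshold are illustrative.

def shingles(text: str, k: int = 5) -> set:
    """Return the set of k-word shingles for a page's body text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a: str, b: str) -> float:
    """Jaccard similarity between two pages' shingle sets."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

page_a = ("create a free instagram template for your brand in minutes "
          "with our easy drag and drop editor and share it today")
page_b = ("create a free facebook template for your brand in minutes "
          "with our easy drag and drop editor and share it today")
score = similarity(page_a, page_b)
print(f"{score:.2f}", "near-duplicate" if score >= 0.6 else "unique")
```

Template pages that differ by only a word or two score very high on this kind of measure, which is exactly what we saw in the template directory.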
Link profile
The client had already expressed interest in enhancing internal linking in Stage 2 of the project.
For now, our goal was understanding the situation using Screaming Frog.
Overall, the links seemed healthy, with valuable insights for refining SEO strategies.
For instance, when we evaluated internal outlinks with no anchor text, we traced them back to a specific template. Though not a top priority, it’s an easy fix at the template level.
Technical SEO
Not every technical adjustment affects rankings, however intriguing it may be.
By this point, the suspicion was growing that content and changes in the SERPs were the primary issues.
Many websites struggle post-Google updates. We suspected Reddit and AI Overviews dominated the search results.
Because AI Overviews function similarly to featured snippets, structured data markup deserved closer attention.
This Screaming Frog analysis helped identify missing structured data at the template level and revealed obvious gaps.
Wild card: Persona-based sitewide signals
Developing well-delineated personas can be advantageous for SEO and marketing overall.
The client was concerned about the dilution of their website’s authority due to a recent persona expansion and shift towards a different market segment.
While I don’t believe proprietary metrics like DA or AS are direct ranking factors, the recent Google API documentation leak suggested Google maintains its own sitewide authority metric.
It’s understandable a company might think expanding to a new market could cause Google to question its target audience.
This aligns with semantic SEO principles: websites should focus on delivering topically related content.
To evaluate if the client faced challenges with the new vertical, we used a combination of:
– Ahrefs to export U.S. keywords and pages ranking.
– A Python script using NLP to analyze the pages and categorize personas.
The findings were revealing: most pages fit both personas, indicating closely aligned content.
This suggested the website didn’t deviate from its core topic too much.
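The actual script used an NLP library, but the core categorization logic can be sketched with simple term matching. The persona names and vocabularies below are hypothetical, not the client's:

```python
# Hypothetical sketch of the persona-categorization step. Persona names
# and term lists are illustrative, not the client's real data.

PERSONA_TERMS = {
    "developer": {"api", "sdk", "xcode", "swift", "debug", "build"},
    "designer": {"template", "mockup", "layout", "font", "palette", "export"},
}

def classify(page_text: str, min_hits: int = 2) -> list:
    """Return every persona whose vocabulary appears at least min_hits times."""
    words = set(page_text.lower().split())
    return sorted(
        persona
        for persona, terms in PERSONA_TERMS.items()
        if len(words & terms) >= min_hits
    )

page = "Use our Swift SDK to export any template layout straight from Xcode"
print(classify(page))  # pages matching both personas suggest aligned content
```

Pages that classify into both personas, as most of the client's did, point to closely aligned content rather than a diluted topic focus.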
Use the sitewide indicators for deeper analysis
Thus far, all signs indicated the website remained valuable and the traffic decline wasn’t due to sitewide problems.
– Challenges were linked to specific sections like the template directory.
– Most traffic came from the blog section.
However, with thousands of pages, the question was where to start.
Using exports from GA4, GSC, and Ahrefs’ Top Pages report, we mapped data for keyword-level impact analysis.
This labor-intensive process required substantial data manipulation with index-match functions and pivot tables.
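As a rough equivalent, the same index-match joins can be done in pandas. The column names below are assumptions based on typical GA4, GSC, and Ahrefs exports, not the client's exact files:

```python
import pandas as pd

# Sketch of the spreadsheet index-match step in pandas: join GA4, GSC,
# and Ahrefs exports on the page URL. Column names and values are
# assumptions for illustration.

ga4 = pd.DataFrame({
    "page": ["/blog/a", "/blog/b"],
    "sessions": [1200, 300],
    "conversions": [40, 2],
})
gsc = pd.DataFrame({
    "page": ["/blog/a", "/blog/b"],
    "clicks": [900, 250],
    "impressions": [30000, 12000],
})
ahrefs = pd.DataFrame({
    "page": ["/blog/a", "/blog/b"],
    "top_keyword": ["instagram template", "ios mockup"],
    "volume": [5400, 880],
})

merged = ga4.merge(gsc, on="page", how="left").merge(ahrefs, on="page", how="left")
print(merged.columns.tolist())
```

A left join keeps every GA4 page even when it is missing from the other exports, which mirrors how index-match leaves blanks rather than dropping rows.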
Though complex, it allowed us to identify priority areas for further investigation, including:
– A label based on the percentage drop post-core update:
  – High: session or conversion decline >= 20%.
  – Medium: session or conversion decline >= 10%.
  – Low/None: session or conversion decline >= 5%.
– Key events and keyword volume:
  – Priority: conversions > 10 plus keyword volume.
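As a minimal sketch, the labeling rules above can be expressed in Python; the thresholds mirror our rules, while the volume floor in the priority check is an illustrative assumption:

```python
# Sketch of the prioritization labels. Drop values are fractions
# (0.25 = a 25% decline); thresholds mirror the rules described above.

def impact_label(session_drop: float, conversion_drop: float) -> str:
    """Label a page's post-update impact from its percentage declines."""
    worst = max(session_drop, conversion_drop)
    if worst >= 0.20:
        return "High"
    if worst >= 0.10:
        return "Medium"
    if worst >= 0.05:
        return "Low/None"
    return "None"

def is_priority(conversions: int, volume: int, min_volume: int = 500) -> bool:
    """Flag pages worth immediate review: converting and with search demand.
    The 500 volume floor is an assumption for illustration."""
    return conversions > 10 and volume >= min_volume

print(impact_label(0.25, 0.05))  # High
print(impact_label(0.12, 0.02))  # Medium
```

Because the checks run top-down, a 12% session decline lands in Medium even though it also clears the Low/None threshold.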
While not perfect, this analysis provided an initial list of 42 pages for closer evaluation.
Various factors, such as publication date, can affect these metrics.
Newer posts might lack authority, or they might benefit from an initial freshness boost that later fades due to missing user signals, possibly linked to NavBoost.
Older pages were identified for refreshing.
By default, Screaming Frog doesn’t extract blog publication dates, so custom extraction was set up using a selector copied from the date element via Inspect in Chrome.
With publication dates alongside our list, we proceeded with in-depth analysis.
Deep dive into affected pages and sections
Manual review of affected pages revealed a clear pattern.
Our suspicions were confirmed: the primary cause of traffic decline was changes in the SERPs in the U.S. and U.K., driven by Reddit’s rise and AI Overviews.
Reddit emerged as a significant winner from the recent core update, impacting many of the client’s keywords considerably.
Moreover, AIO’s expansion, covering over 20% of queries in the client’s sector, has further contributed to the problem.
AIO affects informational queries more than commercial ones, which explains the blog section’s visibility losses.
The client, along with many others, is losing valuable traffic due to Reddit and AIO, not the unhelpfulness of their website. Unfortunately, this isn’t easily solved.
We also inspected the templates directory. Although it wasn’t a traffic or conversion driver, it warranted exploration for two reasons:
– The client invested substantial time deploying it.
– Initial analysis indicated implementation issues.
Like the priority blog pages, a pattern emerged upon reviewing a few template pages.
These pages failed to align with search intent, indicating content improvement was necessary.
Search intent is usually categorized into four main groups, but I advocate for a broader, nuanced understanding.
Someone searching for a template typically wants a selection to choose from, intending to find a few templates they like and use them.
Google recognizes this. A quick search for “Instagram template” yields top results from companies offering collection options.
Unfortunately, this wasn’t fully considered before launching the pages.
Instead of optimizing collection pages to match intent, the client optimized individual template pages, which gave users too little information. Tweaking those pages wouldn’t make a difference while the intent misalignment persisted.
Fixing the ‘unfixable’
Our initial recommendations to the client had to take a holistic approach since no quick fixes existed for the identified issues.
To address these challenges, we advised:
– Implementing a few quick wins and tests.
– Developing an improved strategy considering these shifts.
– Closely measuring AIO and redefining SEO KPIs.
Quick wins and tests
Everyone seeks an “easy fix with high impact” solution. So, let’s prioritize these before delving into the complex problems.
Analysis revealed intriguing test opportunities, improvement areas, and several technical suggestions.
We recommended that the client optimize an initial content piece by aligning it more closely with E-E-A-T guidelines.
I’ve grown increasingly interested in information gain as a potential ranking element in Google’s system.
This concept is sensible, especially in an AI-driven era where new data for model training is valuable. We plan to assess these elements during our tests.
The audit highlighted the need to reconsider the template directory. With some adjustments, it could become a valuable traffic and conversion source.
First, we suggested revisiting the keyword strategy to better align with user intent, setting the stage for optimizing those pages.
Additionally, quick tech wins, particularly in Schema markup, were identified.
Strategic Schema implementation is key rather than marking everything up at once.
For instance, if you’re a SaaS product with excellent reviews, using Review Schema is crucial. Our client had several opportunities to utilize this to enhance their content.
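As a sketch of what such markup might look like for a software product, here is an illustrative JSON-LD fragment; all names and values are made up, and the exact properties should follow Google's current rich results guidelines:

```json
{
  "@context": "https://schema.org",
  "@type": "SoftwareApplication",
  "name": "Example iOS App",
  "operatingSystem": "iOS",
  "applicationCategory": "DesignApplication",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.8",
    "ratingCount": "312"
  }
}
```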
Developing an improved strategy
The client must rethink their strategy; while we have initial ideas, the full details are still taking shape.
One option is focusing on user-generated content (UGC). Reddit capitalizes on UGC, seemingly favored by Google.
However, UGC alone isn’t sufficient to compete with Reddit. It should be part of the strategy, but not the primary approach.
Diversifying the keyword strategy is another option. While exploring less competitive terms where Reddit and AIO aren’t dominant might help, it’s a short-term fix.
Even amidst AI hype skepticism, it’s evident that AIO is an enduring factor that might even gain prominence.
The real question is whether this shift is beneficial for users and companies. Ultimately, any strategy needs to accommodate evolving search dynamics.
As AI increasingly influences early user journey stages, businesses, including our client, will need to rethink content production.
Whether AIO leads searchers or if they use chatbots like ChatGPT directly, many will only access websites during the decision stage. This transforms strategic and SEO approaches.
While targeting commercial keywords stays imperative, the focus will shift towards brand building and producing diverse content—such as video—to rank in video SERPs and on YouTube.
Some content that lost rankings on our client’s site could perform well in video format.
Such changes aren’t negative. Relying solely on Google was never advisable.
Today, strategies must extend beyond SEO, integrating with broader digital marketing endeavors. As a proponent of full-stack marketers, I believe this shift offers exciting possibilities.
Closely measuring AIO and redefining SEO KPIs
Our client ranks in AIO for some terms, but tracking the impact on CTR is infeasible in GSC or GA4, and likely will remain so.
According to Liz Reid, Google Search Head:
– “And we see that the links included in AI Overviews receive more clicks than if the page had appeared as a traditional web listing for that query.”
While this may be generally true, our client was losing clicks, and AIO clicks represent a different user experience.
Measuring AIO clicks remains a challenge needing resolution.
We recommended enhancing the client’s SEO tech stack to include platforms aiding in this tracking.
Currently, they use Ahrefs, which doesn’t yet support AIO tracking (though it likely will soon). Switching to a tool like Semrush would be a logistical challenge for such a large site, so we suggested supplementing with ZipTie.
However, measurement involves more than tools—it necessitates a KPI shift.
Traffic, conversions, and revenue remain key, but brand KPIs and micro-conversions are needed now more than ever.
With search evolution and data privacy changes, tracking every click isn’t feasible.
Building brand resilience and focusing on smaller touchpoints, like newsletter signups and engagement rates, can help fill the gap. As the market evolves, so must our KPIs.