It seems that links no longer rank sites in the same way they used to. A strong batch of links no longer guarantees ranking for a phrase. Other ranking factors may be reducing the influence of links. The following is my opinion, and you may disagree.
For several years, I’ve believed we’re in a search environment where the influence of links has diminished. Links still matter, but in my opinion, content has gained more influence in what gets ranked.
First Barrier to the SERPs: The Core Algorithm
A common question in black hat groups is about the "Pop and Drop." This happens when a site gets links, pops into the SERPs, and then drops off. The links were devalued, which I believe is the Penguin algorithm working in real-time as part of the Core Algorithm.
By real-time, I don’t mean instantaneously. There’s a delay between web indexing and recalculating values across SERPs. Google refers to this as real-time.
Second Barrier to the SERPs: Content Analysis
Bing has long relied more on content than on links. In my opinion, links play a validating role: they help confirm a site's authority, usefulness, and trust. This distinction is crucial.
Trustworthiness and Links
Once you remove anchor text spamming, what remains is a signal indicating trustworthiness.
To what extent should Google trust anchor text? There’s evidence Google might use surrounding text in a way similar to how anchor text was used.
SEOs often overlook Bing and can't explain how to rank on it. That's because Bing focuses on understanding content and user preferences, while much of SEO still focuses on keyword and link-building strategies aimed at Google, strategies that are becoming outdated. Google has been increasing the ranking power of on-page and user preference signals, much like Bing.
How Should Link Building be Approached?
In my opinion, link building should focus on proving trustworthiness and ensuring machines understand the niche our web pages fit into. Communicate trustworthiness by being selective about sites you obtain links from and cautious about where you link out.
For link building, it's important that the page your link sits on has relevant content and that its outgoing links are relevant, and not just on that page but across the entire site. What good is a link from a reputable page if the rest of the site links out to low-quality sites? Such a site will lose its ability to pass PageRank. This reflects how a reduced link graph operates.
Can Google Identify Paid Links?
Speculatively, Google may use outgoing links as a factor for identifying paid links, not just from one page but from the entire site.
News websites that allow contributors who sell links under the table often link out to sites whose outlink patterns place them in negative link neighborhoods. In my opinion, these links don't pass PageRank.
Research on various algorithms shows that re-classifying sites based on their outlink patterns works: trust-type and anti-trust algorithms re-classify sites and produce a reduced link graph. These approaches resemble the Penguin Algorithm and hinder PageRank propagation from untrusted sites.
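The idea can be sketched as a two-step process: propagate trust outward from a set of seed sites (in the spirit of TrustRank), then keep only the sites and links that clear a trust threshold. What remains is the reduced link graph; links from excluded sites simply cannot pass PageRank. The site names, seed set, damping value, and threshold below are all hypothetical illustrations, not anyone's actual algorithm.

```python
# Toy sketch of a reduced link graph: propagate trust from seed sites,
# then drop untrusted nodes so their links cannot pass PageRank.
# All site names, the damping factor, and the threshold are hypothetical.

def propagate_trust(graph, seeds, damping=0.85, iters=20):
    """TrustRank-style propagation: trust flows along outlinks from seeds."""
    trust = {node: (1.0 / len(seeds) if node in seeds else 0.0)
             for node in graph}
    for _ in range(iters):
        nxt = {node: ((1 - damping) / len(seeds) if node in seeds else 0.0)
               for node in graph}
        for node, outlinks in graph.items():
            if not outlinks:
                continue
            share = damping * trust[node] / len(outlinks)
            for target in outlinks:
                if target in nxt:
                    nxt[target] += share
        trust = nxt
    return trust

def reduced_link_graph(graph, trust, threshold=0.01):
    """Keep only edges between sites whose trust clears the threshold."""
    trusted = {n for n, t in trust.items() if t >= threshold}
    return {n: [t for t in graph[n] if t in trusted]
            for n in graph if n in trusted}

# Hypothetical mini-web: "news" is a trusted seed. "spam" links into the
# graph but receives no trust, so its links vanish from the reduced graph.
web = {
    "news": ["blog", "shop"],
    "blog": ["shop"],
    "shop": [],
    "spam": ["shop", "blog"],
}
scores = propagate_trust(web, seeds={"news"})
reduced = reduced_link_graph(web, scores)
```

Note the asymmetry: the spam site's outlinks point at trusted sites, but because no trust flows *into* it, it is excluded entirely, which is why linking out to good sites can't rescue a bad neighborhood.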
Links Overruled?
Sometimes, rankings based on links are overruled. The modification engine—a set of algorithms related to personalization—can influence this. Geography and previous searches can affect the sites you see.
If most people searching with a query are from a geographic area, Google might rank a page from page 2 near the top if it likely satisfies users from that area.
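One way to picture this re-ranking layer: after the core algorithm scores results, a modification step boosts pages that match the searcher's inferred region, which can lift a low-scoring but locally relevant page past stronger global pages. This is a toy illustration only; the page data, scores, and boost weight are invented, and this is not a description of Google's actual mechanism.

```python
# Toy sketch of a personalization "modification engine": after core
# ranking, boost results matching the searcher's region. All page data
# and the boost weight are hypothetical illustrations.

def rerank_for_region(results, user_region, boost=5.0):
    """Re-score (page, core_score, region) tuples for one user's region."""
    rescored = [
        (page, score + (boost if region == user_region else 0.0))
        for page, score, region in results
    ]
    return [page for page, _ in sorted(rescored, key=lambda x: -x[1])]

# A page buried by its core score can surface near the top once the
# regional boost is applied for a searcher in that region.
core_results = [
    ("global-guide.example", 9.0, "US"),
    ("big-brand.example", 8.5, "US"),
    ("local-shop.example", 4.5, "DE"),  # low core score, regional match
]
print(rerank_for_region(core_results, user_region="DE"))
```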
TAKEAWAY #1 – Links for Inclusion
Even when core ranking factors are set aside, links still matter for inclusion. To remain in the SERPs, consider both the outbound links on your site and the links you obtain. Think in terms of reduced link graphs: keep your link profile clean and within trusted circles.
In my view, you must be part of the trusted reduced link graph to stay in the game.
TAKEAWAY #2 – Lose the 200 Ranking Factors
As a link builder, determine why Google ranks a site in positions 1-3. It's not always about links or on-page content. Adjust your strategy accordingly, and don't settle for the obvious explanation.
Just because a site has many “powered by” footer links doesn’t mean they power those rankings. In modified SERPs, traditional ranking factors are set aside, so those links aren’t the driving force.
Focus less on 200+ ranking factors when diagnosing why certain sites rank in top positions. If obvious ranking factors don’t make sense, consider the perspective of the Modification Engine.
Failing to set aside 200+ factors may cause you to miss the real reason a site ranks, like when “powered by” links are believed to be significant, but user intent in the content might be the actual reason.
If Google prefers sites from a specific area, get links from sites associated with that area, whether through the site's name or its Whois registration data.
Sometimes Google favors educational, scientific, or informational sites. Determine and incorporate this into your link-building strategy.
Stop thinking in terms of 200+ ranking factors. Focus on competitor analysis and content creation based on user preferences shown by Google’s rankings.
You can’t simply earn, build, or buy links to rank in the top positions if your page isn’t the same type as those currently ranking high.
Did Google Win the Link War?
There's an issue affecting link usefulness: how links are counted, or not counted. Years have passed since new research was published on combating spammy links or identifying legitimate links between sites.
Today’s research focuses on understanding content and user intent, suggesting the link war is over.
Have search engines won?
Not in my view. Google focuses on users, and practical publishers should also focus on what users want.
Do SEOs Focus Too Much on Links?
Googlers have said that SEOs fixate on links, and I tend to agree. That fixation fuels private blog networks and the paid-link trade.
Links are important, but so is creating the content that earns them. It's wise to return to basics: understand what users want (the search queries) and how that relates to your on-page content.