A Google patent details a method for classifying websites as low quality by evaluating their links. Titled “Classifying Sites as Low Quality Sites,” it describes specific factors used to flag low-quality sites. Understanding these factors is worthwhile even though it’s unclear whether they’re currently in use, because they can inform SEO practices regardless.
### An Obscure Link Algorithm
The patent dates from 2012 to 2015, a period that aligns with the initial release of the Penguin algorithm. Discussion of this algorithm is scarce, and the information presented here goes into more detail than usual, so many readers may not be familiar with it. Understanding it matters because parts of it, if used, could influence the SEO process.
### Just Because it’s Patented…
It’s important to remember that a patent does not guarantee implementation. This particular patent, which coincides with the Penguin algorithm timeframe, offers insight into ranking links rather than ranking sites directly. That distinction makes it potentially impactful on SEO if it is in use.
Understanding potential algorithms is valuable as it clarifies possibilities and helps identify flawed SEO information.
### How the Algorithm Ranks Links
Named “Classifying Sites as Low Quality,” the algorithm ranks links rather than content, operating on the principle that low-quality links indicate low-quality sites. Because it acts after the ranking stage, which includes algorithms like Penguin, this approach may be resistant to manipulation by spammy links. By filtering out low-quality links, it produces a reduced link graph with spam removed.
The algorithm uses three ranking scores, termed “quality groups”: Vital, Good, and Bad. A total score is calculated from these groups, and a site or page is classified as low quality if that score falls below a specific threshold.
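As a rough illustration of how such a score could work, here is a minimal sketch in Python. The group weights, the threshold value, and the `classify_site` helper are all assumptions made for illustration; the patent names the quality groups but does not publish the math used to combine them.

```python
# A minimal sketch of the scoring idea. The weights and threshold below are
# assumed values, not figures from the patent.

GROUP_WEIGHTS = {"vital": 1.0, "good": 0.5, "bad": -1.0}  # assumed weights
LOW_QUALITY_THRESHOLD = 0.0  # assumed cutoff

def classify_site(link_groups: dict[str, int]) -> str:
    """link_groups maps each quality group name to a count of inbound links."""
    total = sum(GROUP_WEIGHTS[group] * count
                for group, count in link_groups.items())
    return "low quality" if total < LOW_QUALITY_THRESHOLD else "not flagged"

# Example: 2 vital, 10 good, and 30 bad inbound links
print(classify_site({"vital": 2, "good": 10, "bad": 30}))  # -> "low quality"
```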
### Implied Links
The patent introduces the concept of Implied Links, which are distinct from unlinked citations. There is speculation within the SEO community about what these are; some researchers describe them as latent or virtual connections that arise from shared link relationships between separate sites, where no direct link exists but mutual connections hint at a relationship.
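One speculative way to picture an implied link is through shared linking patterns: if many of the same domains link to two sites that never link to each other, a relationship between those sites might be inferred. The sketch below assumes that reading; the `min_shared` cutoff and the set-based representation are illustrative, not details from the patent.

```python
# A speculative sketch of inferring an implied link from shared link
# relationships. The overlap threshold is an assumption for illustration.

def implied_link_exists(linkers_to_a: set[str],
                        linkers_to_b: set[str],
                        min_shared: int = 5) -> bool:
    """Return True if enough domains link to both sites."""
    return len(linkers_to_a & linkers_to_b) >= min_shared

# Example: five domains link to both site A and site B
shared_linkers = {f"domain{i}.example" for i in range(5)}
print(implied_link_exists(shared_linkers, shared_linkers))  # -> True
```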
### Link Quality Factors
Knowing the factors named in the patent can guide how a link strategy is created, even if it’s uncertain whether they’re in use. These factors include:
#### Diversity Filtering
This process discards redundant links from the same site, retaining only one to represent them.
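A minimal sketch of what that could look like, assuming “same site” simply means “same linking domain” and that the first link seen is kept as the representative:

```python
# Diversity filtering sketch: collapse multiple links from one domain into a
# single representative link. "Same domain" is an assumed interpretation.

from urllib.parse import urlparse

def diversity_filter(inbound_links: list[str]) -> list[str]:
    """Keep only the first link seen from each linking domain."""
    seen_domains: set[str] = set()
    kept: list[str] = []
    for url in inbound_links:
        domain = urlparse(url).netloc
        if domain not in seen_domains:
            seen_domains.add(domain)
            kept.append(url)
    return kept
```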
#### Boilerplate Links
These may be ignored if they lack context, such as site-wide links appearing in navigation or footers.
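A rough sketch of that idea, assuming each inbound link has already been annotated with the page section it appears in; the section labels and the link structure are hypothetical:

```python
# Boilerplate filtering sketch: drop links found in navigation, footer, or
# sidebar sections. The section labels are assumed, not from the patent.

BOILERPLATE_SECTIONS = {"nav", "footer", "sidebar"}  # assumed labels

def drop_boilerplate(links: list[dict]) -> list[dict]:
    """Keep only links that appear within the main content of a page."""
    return [link for link in links
            if link.get("section") not in BOILERPLATE_SECTIONS]
```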
#### Links That Are Related
Links from sites that link excessively to the same destinations could indicate a spam network. The patent describes a group of sites all linking to the same domains or IP addresses.
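The sketch below illustrates one simplified way such a check could work, treating a group of linking sites as a possible network when their outbound targets overlap heavily. The overlap ratio is an assumed parameter, not a value from the patent.

```python
# Spam-network sketch: flag a group of linking sites whose outbound targets
# (domains or IPs) overlap heavily. The 0.8 overlap ratio is assumed.

def looks_like_network(outbound_targets: dict[str, set[str]],
                       min_overlap: float = 0.8) -> bool:
    """outbound_targets maps each linking site to the domains/IPs it links to."""
    target_sets = list(outbound_targets.values())
    if not target_sets:
        return False
    shared = set.intersection(*target_sets)
    smallest = min(len(s) for s in target_sets)
    return smallest > 0 and len(shared) / smallest >= min_overlap
```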
#### Links from Sites with Similar Content Context
Links whose surrounding content context is too closely aligned may be discarded, with only one representative link preserved.
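As an illustration, the sketch below uses simple word-overlap (Jaccard) similarity as a stand-in for whatever context comparison the patent intends; the similarity measure and the cutoff are assumptions.

```python
# Context-similarity sketch: keep one representative link when the text
# around several links is nearly identical. Jaccard similarity and the 0.9
# cutoff are assumptions for illustration.

def jaccard(a: str, b: str) -> float:
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb) if wa | wb else 0.0

def dedupe_by_context(links: list[tuple[str, str]],
                      cutoff: float = 0.9) -> list[tuple[str, str]]:
    """links is a list of (url, surrounding_text) pairs; near-duplicate
    contexts keep only the first representative link."""
    kept: list[tuple[str, str]] = []
    for url, context in links:
        if all(jaccard(context, kept_context) < cutoff
               for _, kept_context in kept):
            kept.append((url, context))
    return kept
```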
### Overview and Takeaways
The algorithm is described as improving search results: it operates after ranking, assesses the quality of inbound links, and adjusts a site’s ranking based on its link scores. It is notable for evaluating links rather than the sites themselves.
Overall, understanding this algorithm can provide insights into potential link ranking processes, offering guidance for improving SEO strategies.