May 5, 2024, marked the first-ever leak of the most comprehensive collection of Google Search API ranking documentation in the history of search engines – a truly historic moment we might never have seen if Erfan Azimi, founder and CEO of an SEO agency, hadn't spotted Google's internal documents that were mistakenly pushed to GitHub on March 27, 2024, and apparently never taken down. The irony is that they were published under the Apache 2.0 license, which allows anyone with access to use, modify, and distribute them. Sharing the documents with two of the most reputable SEO experts, Rand Fishkin and Mike King – the next step Azimi took after spotting the leak – was therefore within the legal boundaries of the license. Both released the documents and their accompanying analyses on May 27.
Navboost is a Google ranking algorithm that was revealed during the company's antitrust trial with the U.S. Department of Justice. It enhances results for navigational queries by using signals such as user clicks to identify the most relevant outcomes. Navboost retains click data for queries going back 13 months and segments results by localization and device type (mobile or desktop). This ranking signal is crucial for SEO professionals to understand and optimize for, as it can significantly impact a website's visibility in search results. Clicks are a primary ranking signal, indeed.
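The leaked modules contain no source code, so the snippet below is only a toy sketch of what a Navboost-style re-ranker could look like based on the behavior described above: click counters keyed by query, locale, and device, a roughly 13-month retention window, and a boost for results that historically earned clicks. Every name here (`ClickStore`, `navboost_rerank`, the weighting) is hypothetical, not Google's implementation.

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Illustrative only: a toy model of click-based re-ranking segmented by
# query, locale, and device, with clicks older than ~13 months ignored.
CLICK_WINDOW = timedelta(days=13 * 30)

class ClickStore:
    def __init__(self):
        # (query, locale, device) -> list of (url, timestamp) click events
        self.events = defaultdict(list)

    def record_click(self, query, locale, device, url, when=None):
        self.events[(query, locale, device)].append((url, when or datetime.utcnow()))

    def recent_clicks(self, query, locale, device, now=None):
        """Count clicks per URL within the retention window for this slice."""
        now = now or datetime.utcnow()
        counts = defaultdict(int)
        for url, ts in self.events[(query, locale, device)]:
            if now - ts <= CLICK_WINDOW:
                counts[url] += 1
        return counts

def navboost_rerank(results, store, query, locale, device, weight=0.1):
    """Re-order baseline results by blending in historical click counts."""
    clicks = store.recent_clicks(query, locale, device)
    return sorted(
        results,
        key=lambda r: r["base_score"] + weight * clicks.get(r["url"], 0),
        reverse=True,
    )
```

The point of the sketch is simply that a system like this never needs to expose "CTR" as a public metric to still let accumulated clicks reorder results per locale and device.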
Google has denied for years that clicks are a primary ranking factor. Its representatives, including Gary Illyes, have consistently emphasized that click-through rate (CTR) is a "very noisy signal" and that using clicks directly in rankings would be problematic due to the potential for manipulation. They have explained that while click data is used for evaluation and experimentation purposes to assess changes in the search algorithm, it is not a primary factor in determining search rankings.
The leaked documents prove otherwise. It does matter how many clicks a website can generate. The more on-page optimization and sustained content marketing you do, the more traffic you attract – and more clicks feed back into higher rankings and, ultimately, more conversions.
Google representatives have consistently misdirected and misled us about how their systems operate, aiming to influence SEO behavior. While their public statements may not be intentional lies, they are designed to deceive potential spammers – and, along the way, many legitimate SEO professionals – by obscuring how search results can be influenced. Gary Illyes, an analyst on the Google Search team, has reiterated this point numerous times. He's not alone: John Mueller, Google's Senior Webmaster Trends Analyst and Search Relations team lead, once stated that Google doesn't have a website authority score.
However, as the data leak suggests, Google does have an overall domain authority measure. As part of the Compressed Quality Signals stored on a per-document basis, Google computes a feature called "siteAuthority." According to Mike King, Founder and CEO of iPullRank, while this measure's specific computation and use in downstream scoring functions remain unclear, we now know definitively that a sitewide authority score exists at Google and is used in the Q* ranking system.
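Since the documentation only names the attribute, the sketch below is a guess at the general shape of the idea: a per-document quality record that carries a site-level score, which is then blended into the final score. Only the attribute name "siteAuthority" comes from the leak; the record layout, the blending formula, and the weights are all assumptions made for illustration.

```python
from dataclasses import dataclass

# Hypothetical sketch: "siteAuthority" appears among the leaked per-document
# Compressed Quality Signals, but its computation and weighting are not
# documented. Everything below is an illustrative assumption.

@dataclass
class CompressedQualitySignals:
    site_authority: float  # assumed 0.0-1.0 site-level score ("siteAuthority")
    page_quality: float    # assumed 0.0-1.0 per-document quality score

def blended_score(relevance: float, q: CompressedQualitySignals,
                  site_weight: float = 0.2, page_weight: float = 0.3) -> float:
    """Toy blend of query relevance with stored per-document quality signals."""
    return relevance * (1 + site_weight * q.site_authority + page_weight * q.page_quality)

# Example: identical relevance, but the document on the higher-authority
# site ends up with the higher blended score under this assumed formula.
strong_site = CompressedQualitySignals(site_authority=0.9, page_quality=0.7)
weak_site = CompressedQualitySignals(site_authority=0.2, page_quality=0.7)
print(blended_score(0.8, strong_site), blended_score(0.8, weak_site))
```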
The recent Google Search API leak also revealed the existence of whitelists that are used to ensure the quality and reliability of information, particularly for sensitive topics like health and news, where misinformation could have serious consequences for public well-being.
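The documents don't spell out how these whitelists are applied, so the snippet below is only a rough guess at the general idea: for queries classified into a sensitive vertical, results from a trusted list of domains get preferential treatment. The topic classification, the domain lists, and the boost value are all invented for illustration.

```python
# Purely illustrative: a toy whitelist filter for sensitive verticals.
# The leak confirms whitelists exist; it does not describe this mechanism.
SENSITIVE_TOPICS = {"health", "elections", "news"}

WHITELISTED_DOMAINS = {
    "health": {"who.int", "cdc.gov"},
    "elections": {"vote.gov"},
}

def apply_whitelist(results, topic, boost=0.5):
    """Boost whitelisted domains for sensitive topics; leave other queries alone."""
    if topic not in SENSITIVE_TOPICS:
        return results
    trusted = WHITELISTED_DOMAINS.get(topic, set())
    return sorted(
        results,
        key=lambda r: r["score"] + (boost if r["domain"] in trusted else 0.0),
        reverse=True,
    )
```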