this post was submitted on 11 Jun 2024
93 points (96.0% liked)


May 5, 2024, marked the first-ever leak of the most comprehensive collection of Google Search API ranking factors in the history of search engines – a truly historic moment we might never have seen had Erfan Azimi, founder and CEO of an SEO agency, not spotted Google's internal documents, which were mistakenly published on GitHub on March 27, 2024, and simply never taken down. Ironically, they were released under the Apache 2.0 license, which allows anyone with access to use, modify, and distribute them. Sharing the documents with two of the most reputable SEO experts, Rand Fishkin and Mike King – the next step Azimi took after spotting the leak – was therefore within the legal boundaries of the license. Both published the documents, along with their analyses, on May 27.

Navboost is a Google ranking algorithm revealed during the company's antitrust trial against the U.S. Department of Justice. It improves results for navigational queries by using signals such as user clicks to identify the most relevant pages. Navboost retains click data for queries going back 13 months and differentiates results by localization and device type (mobile or desktop). Understanding and optimizing for this signal is crucial for SEO professionals, as it can significantly affect a website's visibility in search results.

Clicks are a primary ranking signal, indeed
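To make the description above concrete, here is a minimal sketch of a Navboost-style aggregator: clicks are bucketed per (query, locale, device) and only clicks inside a 13-month window count toward ranking. Everything here is an illustration – the class, method names, and the simple click-count scoring are assumptions; the leak and trial testimony describe the signals Navboost uses, not its actual code or formula.

```python
from collections import defaultdict
from datetime import datetime, timedelta

# The 13-month retention window is the figure reported from the trial;
# approximating a month as 30 days here for simplicity.
RETENTION = timedelta(days=13 * 30)

class NavboostSketch:
    """Toy model: aggregate clicks per (query, locale, device) and rank URLs."""

    def __init__(self):
        # (query, locale, device) -> url -> list of click timestamps
        self.clicks = defaultdict(lambda: defaultdict(list))

    def record_click(self, query, url, locale, device, when):
        self.clicks[(query, locale, device)][url].append(when)

    def rank(self, query, locale, device, now):
        """Return URLs ordered by recent-click count for this query slice."""
        cutoff = now - RETENTION
        counts = {
            url: sum(1 for t in stamps if t >= cutoff)
            for url, stamps in self.clicks[(query, locale, device)].items()
        }
        return sorted(counts, key=counts.get, reverse=True)
```

Note how the (query, locale, device) key captures the localization and mobile/desktop differentiation described above, while the cutoff drops clicks older than the retention window.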

Google has denied for years that clicks are a primary ranking factor. Its representatives, including Gary Illyes, have consistently emphasized that click-through rate (CTR) is a "very noisy signal" and that using clicks directly in rankings would be problematic due to the potential for manipulation. They have explained that while click data is used for evaluation and experimentation purposes to assess changes in the search algorithm, it is not a primary factor in determining search rankings.

The leaked documents prove otherwise. How many clicks a website generates does matter. The more on-page optimization and sustained content marketing you do, the more traffic you attract – which means more clicks, higher rankings, and ultimately more conversions.

Google representatives have consistently misdirected and misled us about how their systems operate, aiming to influence SEO behavior. While their public statements may not be intentional lies, they are designed to deceive potential spammers—and many legitimate SEO professionals—by obscuring how search results can be impacted. Gary Illyes, an analyst on the Google Search team, has repeated these denials numerous times. He's not alone: John Mueller, Google's Senior Webmaster Trends Analyst and Search Relations team lead, once stated that Google doesn't have a website authority score.

However, as the data leak suggests, Google does have an overall domain authority measure. As part of the Compressed Quality Signals stored on a per-document basis, Google computes a feature called "siteAuthority." According to Mike King, Founder and CEO of iPullRank, while this measure's specific computation and application in downstream scoring functions remain unclear, we now definitively know that Google's domain authority exists and is used in the Q* ranking system.
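Since the leak names the "siteAuthority" attribute but not how it is computed or weighted, the most honest illustration is a data-shape sketch: a per-document quality module carrying a site-level score, plus one *invented* way such a score could be blended into a document's ranking. Only the attribute name comes from the leaked documentation; the field type, the weight, and the blending formula are assumptions for illustration.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class CompressedQualitySignals:
    """Sketch of the per-document quality module named in the leak.

    Only the `site_authority` attribute name comes from the leaked docs;
    its type and scale are assumptions -- the leak does not reveal how
    the score is computed or stored.
    """
    site_authority: Optional[float] = None  # site-level authority (scale unknown)

def apply_site_authority(base_score: float,
                         signals: CompressedQualitySignals,
                         weight: float = 0.2) -> float:
    """Hypothetical blend showing how a site-level score *could* feed a
    document's final score. The weight and linear formula are invented."""
    if signals.site_authority is None:
        return base_score
    return (1 - weight) * base_score + weight * signals.site_authority
```

The point of the sketch is the structure, not the math: the leak shows a site-wide signal stored alongside each document's own signals, exactly the kind of "domain authority" measure Google said it didn't have.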

The recent Google Search API leak also revealed the existence of whitelists used to ensure the quality and reliability of information, particularly for sensitive topics such as health and news, where misinformation could have serious consequences for public well-being.

top 4 comments
[–] far_university1990@feddit.de 12 points 5 months ago (1 children)

spammers—and many legitimate SEO professionals

So, spammer and… professional spammer.

[–] redditReallySucks@lemmy.dbzer0.com 2 points 5 months ago (1 children)

But they help you find interesting articles. Think of all the small businesses that need SEO to survive. /s

OK actually they do but it's abused by big ones

[–] far_university1990@feddit.de 4 points 5 months ago

If small business need manipulate algorithm to survive, then algorithm not good.

[–] rob_t_firefly@lemmy.world 2 points 5 months ago

The ugly AI-generated illustration of the "gorgle search leabk" does not add anything of value to this article.