2,596: How To Make The Most Out Of Google’s Leaked Ranking Factors

In the past week, I’ve noticed numerous objections to thoroughly examining the 2,596 pages.

However, the primary inquiry we should focus on is, “How can we maximize our testing and learning from these documents?”

SEO operates as an applied science, wherein theory serves as a foundation for experiments rather than the ultimate objective.

14,000 Test Ideas

It’s hard to find a more fertile ground for generating test ideas. However, not all factors can be tested equally. They vary in type (number/integer ranges, Boolean yes/no flags, string words or lists) and in reaction time (how quickly a change influences organic rankings).

Consequently, we can conduct A/B tests on fast and active factors, whereas we need to resort to before/after tests for slow and passive ones.

To systematically test ranking factors:

  1. Choose a ranking factor.
  2. Determine the impacted (success) metric.
  3. Define the testing environment.
  4. Specify the type of test to be conducted.

Ranking Factors

While many ranking factors in the data leak are represented as integers, indicating a spectrum, certain Boolean factors lend themselves to straightforward testing:

  • Image compression: Yes/No?
  • Intrusive interstitials: Yes/No?
  • Core Web Vitals: Yes/No?

Factors within your direct control include:

  • UX (such as navigation, font size, line spacing, and image quality).
  • Content (ensuring freshness, optimizing titles, avoiding duplication, incorporating relevant entities, aligning with user intent, maintaining high quality, attributing original sources, using canonical word forms rather than slang, fostering high-quality user-generated content, and featuring expert authorship).
  • User engagement (measured by a high rate of task completion).


Factors That Can Hurt (Negative) Rankings:

  • Backlinks from low-quality pages and domains.
  • Overly aggressive anchor text usage (unless supported by a robust link profile).
  • Substandard navigation experience.
  • Weak user engagement signals.

Influential Factors with Limited Direct Control:

  • Relevance and alignment of titles between the source and linked content.
  • Click-through rates on links.
  • Backlinks from reputable and newly established pages.
  • Domain authority.
  • Mentions of your brand.
  • Homepage PageRank.

Before implementing changes, conduct a thorough evaluation of your performance in the specific area you intend to test, such as Core Web Vitals.


Based on the factors described in the leaked documents, these are the success metrics they can plausibly impact:

  1. Crawl rate: Monitoring the frequency of crawls by search engine bots.
  2. Indexing (Yes/No): Determining if a page has been indexed by search engines.
  3. Rank (for main keyword): Tracking the position of a webpage for its primary keyword.
  4. Click-through rate (CTR): Assessing the percentage of users who click on a search result compared to those who see it.
  5. Engagement: Evaluating user interaction and involvement with the webpage, possibly measured by metrics like time on page, bounce rate, and pages per session.
  6. Keywords a page ranks for: Identifying the range and relevance of keywords that lead to the page being displayed in search results.
  7. Organic clicks: Counting the number of clicks a webpage receives from organic search results.
  8. Impressions: Recording the number of times a webpage appears in search results.
  9. Rich snippets: Observing whether search results for a webpage include enhanced elements like featured snippets, reviews, or ratings.

Where To Test

When testing, opt for a country-specific domain or a platform with minimal risk exposure if you’re uncertain. For multilingual sites, consider implementing changes in one country initially and gauge performance against your primary market.

Isolating the test impact is crucial; focus on single-page alterations or specific subdirectories.

Concentrate tests on pages targeting particular keywords (e.g., “Best X”) or user intentions (e.g., “Read reviews”).

Keep in mind that some ranking factors apply across your entire site, such as site authority, while others are specific to individual pages, like click-through rates.


Ranking factors possess the potential to either complement or contradict each other as integral components of an equation.

Humans tend to struggle with grasping functions comprising numerous variables, suggesting that we likely underestimate the intricacies involved in achieving a high rank score, as well as the substantial impact a handful of variables can wield on the outcome.

Despite the intricate interplay among ranking factors, it’s essential not to shy away from experimentation.

Aggregators often find testing more straightforward than Integrators, benefiting from a wealth of comparable pages that yield more pronounced results. Conversely, Integrators, tasked with generating original content, encounter variations across each page, which can dilute test outcomes.

One of my preferred methodologies involves self-assessment of ranking factors to deepen one’s understanding of SEO, followed by methodically challenging and testing assumptions. Create a spreadsheet listing each ranking factor, assign a numerical value between zero and one based on perceived importance, and then multiply all factors together.
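The multiply-the-weights exercise can be sketched in a few lines of Python. The factors and weights below are illustrative assumptions; the point is that a product of weights, unlike a sum, is dragged down sharply by any single weak factor:

```python
# Self-assessment sketch: assign each factor a weight between 0 and 1,
# then multiply the weights together, as the article suggests.
# The factors and weights below are illustrative assumptions.
from math import prod

weights = {
    "title optimization": 0.9,
    "content freshness": 0.7,
    "backlink quality": 0.8,
    "Core Web Vitals": 0.5,
}

score = prod(weights.values())
print(round(score, 3))  # 0.252 — one weak factor pulls the whole product down
```

Revisiting these weights after each test is a cheap way to make your assumptions explicit and falsifiable.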

Monitoring Systems

Testing merely provides an initial insight into the significance of ranking factors. However, it is through monitoring that we can observe and analyze relationships evolving over time, leading to more substantial conclusions.

The concept entails tracking metrics indicative of ranking factors, such as Click-Through Rate (CTR) reflecting title optimization, and plotting them over time to assess the effectiveness of optimization efforts. This approach is akin to standard monitoring practices, with the addition of novel metrics.

Monitoring systems can be constructed using various platforms, including:

  • Looker
  • Amplitude
  • Mixpanel
  • Tableau
  • Domo
  • Geckoboard
  • GoodData
  • Power BI

The choice of tool is secondary to selecting the appropriate metrics and URL paths for analysis.

Example Metrics

To gauge the impact of optimizations, consider measuring metrics either by page type or a defined set of URLs over time. Here are some key metrics to track, although I encourage you to question and refine these thresholds based on your personal experience:

User Engagement:

  • Average number of clicks on navigation.
  • Average scroll depth.
  • Click-Through Rate (SERP to site).

Backlink Quality:

  • Percentage of links with high topic-fit/title-fit between source and target.
  • Percentage of links from pages younger than 1 year.
  • Percentage of links from pages ranking for at least one keyword in the top 10.

Page Quality:

  • Average dwell time (compare between pages of the same type).
  • Percentage of users spending at least 30 seconds on the site.
  • Percentage of pages ranking in the top 3 for their target keyword.

Site Quality:

  • Percentage of pages driving organic traffic.
  • Percentage of zero-click URLs over the last 90 days.
  • Ratio between indexed and non-indexed pages.
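As one concrete example, a site-quality metric like "percentage of pages driving organic traffic" can be computed from a per-URL clicks export (such as a Search Console download). The column layout and numbers here are assumed for illustration:

```python
# Compute "percentage of pages driving organic traffic" from a
# per-URL organic-clicks export. The data below is illustrative.
pages = {
    "/": 1200,
    "/pricing": 310,
    "/blog/post-a": 45,
    "/blog/post-b": 0,
    "/blog/post-c": 0,
}

driving = sum(1 for clicks in pages.values() if clicks > 0)
pct_driving = 100 * driving / len(pages)
print(f"{pct_driving:.0f}% of pages drive organic traffic")  # 60%
```

Recomputing the same metric on a fixed URL set each month turns a one-off snapshot into a monitoring series.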

It’s ironic that the leak occurred shortly after Google began displaying AI-generated results (AI Overviews), as we can now leverage AI to identify SEO gaps based on the leaked information.

One method involves analyzing title matching between the source and target for backlinks. Utilizing common SEO tools, we can extract titles, anchor text, and surrounding content from both the referring and target pages.

Next, we can assess the topical proximity or token overlap using various AI tools, Google Sheets/Excel integrations, or local large language models (LLMs). This can be facilitated by asking simple prompts such as, “Rate the topical proximity of the title (column B) compared to the anchor (column C) on a scale of 1 to 10, with 10 representing an exact match and 1 indicating no relationship.”
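For the token-overlap variant, you don’t even need an LLM: a Jaccard score over lowercased tokens gives a cheap first approximation. A sketch; the title and anchor strings are made up for illustration:

```python
# Lightweight token-overlap (Jaccard) score between a source title and
# anchor text, as a cheap stand-in for an LLM "topical proximity" rating.
import re

def jaccard(a: str, b: str) -> float:
    tokenize = lambda s: set(re.findall(r"[a-z0-9]+", s.lower()))
    ta, tb = tokenize(a), tokenize(b)
    return len(ta & tb) / len(ta | tb) if ta | tb else 0.0

title = "Best Running Shoes 2024: Tested and Reviewed"
anchor = "best running shoes"
print(round(jaccard(title, anchor), 2))  # 0.43
```

Scores near 1 indicate a tight title/anchor fit; scores near 0 flag links worth a manual look. An LLM rating will catch synonyms that token overlap misses, so treat this as a fast triage pass.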

A Leak Of Their Own

The recent revelation of Google’s ranking factors isn’t the first time a major platform’s algorithm has been laid bare:

  • In January 2023, a Yandex leak uncovered many of the same ranking factors now seen in Google’s latest disclosure. The lackluster response then mirrored the current reaction.
  • March 2023 saw Twitter unveiling much of its algorithm, though like Google’s leak, it lacked cohesive “context” among the factors. Nevertheless, it provided valuable insights.
  • Also in March 2023, Instagram’s CEO, Adam Mosseri, shared a comprehensive follow-up detailing how content is ranked across different facets of the platform.

Despite these leaks, there have been no reported cases of users or brands successfully exploiting the disclosed factors to game these platforms.

Platform engagement and its role in gaming the system become especially intriguing in light of the recent Google algorithm leak. Unlike platforms driven solely by user behavior, Google relies heavily on user intent signaled through searches.

Knowing the key elements shaping the algorithm is significant progress, even if the exact proportions remain elusive.

Google’s historic secrecy surrounding ranking factors is perplexing. While full disclosure might not be feasible, promoting a better web ecosystem—characterized by fast, user-friendly, visually appealing, and informative websites—could have been incentivized.

However, the ambiguity surrounding ranking criteria resulted in widespread speculation, fostering the proliferation of subpar content. This, in turn, prompted algorithm updates that inflicted financial losses on numerous businesses.

Original news from SearchEngineJournal