
Necessary Evil: Signal Overload: How NavBoost Rewrote the SEO Playbook – NavBoost, Part Two


Signal Signal, Everywhere a Signal?

What we know from three decades of running websites is that the data NavBoost surfaces will hold surprises that run counter to “prevailing wisdom.” It always works that way. For example, 20 years ago I heard directly from a search engine programmer (ahem, no name…they still work there) that a little bounce rate was good for Google, as it drove longer engagements by users and resulted in more ad clicks. A little bounce rate was not only tolerable, it led directly to better outcomes for users and the search engine. That reality runs counter to any prevailing wisdom I’ve heard in all of SEO about bounce rate. The same will be true with NavBoost – so let’s tiptoe into this and remember, the BS runs high on this topic throughout the web. Let’s also remember that Google has had 2+ decades to interpret, analyze, and apply this data in their systems. What was true in testimony about 2005 surely isn’t true today.

Giving traffic to publisher sites is kind of a necessary evil.

– Former senior exec from Google in Bloomberg

So in this extended article we are going to run down the signals we can think of that may be of interest to Google.

Necessary Evil: Pushing NavBoost – In Three Parts

What data do we really have on NavBoost?

  1. The first occurrence of NavBoost in any document (that I know of) is in the US FTC vs. Google leak from 2012: a research doc that was never published (as far as we know) but was partially leaked in a redacted document by the Wall Street Journal, which mentions NavBoost in passing. (pdf)
    Doc via WSJ

    That document also includes the most clearly defined statement from Google on how clicks are used, from former chief of search quality Udi Manber:

    The ranking itself is affected by the click data. If we discover that, for a particular query, hypothetically, 80 percent of people click on Result No. 2 and only 10 percent click on Result No. 1, after a while we figure out, well, probably Result 2 is the one people want. So we’ll switch it.

  2. The second occurrence happened in deposition testimony at the US vs. Google antitrust trial, including some older email snippets with interesting comments from what appears to be a Google coder (we don’t know who to attribute this to). One email states that NavBoost is probably more important than the entirety of the “rest of ranking”:

    For example, I’m pretty sure that NavBoost alone was/ is more positive on clicks (and likely even on precision/ utility metrics) by itself than the rest of ranking (BTW, engineers outside of Navboost team used to be also not happy about the power of Navboost, and the fact it was “stealing wins”)

  3. The Google GitHub API doc exposure. We have no idea how much of this, if any, is implemented. These docs do not paint a complete picture of how or what NavBoost/Glue collect. We get some of it from the docs, but it is obvious that other metrics and “sliced” data are in play.
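Manber’s hypothetical above – 80 percent of people clicking Result No. 2 versus 10 percent on Result No. 1 – is easy to picture as code. This is a minimal illustrative sketch, not Google’s implementation; the function name, the ratio threshold, and the click-share inputs are all invented:

```python
# Toy version of click-based reranking per Manber's testimony: promote a
# result when it draws far more clicks than the one ranked above it.
# swap_ratio is an invented threshold, not anything Google has disclosed.

def rerank_by_clicks(results, click_share, swap_ratio=2.0):
    """Reorder result ids when a lower one heavily out-clicks the one above.

    results     -- list of result ids in current rank order
    click_share -- dict: result id -> fraction of SERP clicks it received
    """
    ranked = list(results)
    changed = True
    while changed:  # bubble better-clicked results upward until stable
        changed = False
        for i in range(len(ranked) - 1):
            upper, lower = ranked[i], ranked[i + 1]
            hi = click_share.get(upper, 0.0)
            lo = click_share.get(lower, 0.0)
            if lo > 0 and lo >= swap_ratio * hi:
                ranked[i], ranked[i + 1] = lower, upper
                changed = True
    return ranked

# Manber's hypothetical: 10% click rank 1, 80% click rank 2 -> swap them.
print(rerank_by_clicks(["a", "b", "c"], {"a": 0.10, "b": 0.80, "c": 0.05}))
# -> ['b', 'a', 'c']
```

The real system almost certainly weighs far more than raw click counts (see the signal lists later in this article); the point is only that the quoted logic is mechanically simple.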
Fact Check: Let’s be crystal clear: none of those docs tell us how NavBoost is applied within the Google algo, its indexer, or other parts of the Google black box. We also need to ratchet that reality up to eleven in the All-AI-All-The-Time era. Everything said from here onward, and everything you read from SEOs about NavBoost, is at best educated guessing.

Read More: For some background on this, go read Chris Silver Smith’s outstanding article on SemRush’s SearchEngineLand.com. It goes deep into the weeds on what Google said and when they said it.

One point I will quibble with here is that so many of the posts on NavBoost try to talk about Good Clicks vs. Bad Clicks. Those are not defined anywhere in the available information we have. A Good Click to Google may be radically different from what we – as webmasters and SEOs – believe it is. The closest we can come is a mention in one doc that if a user clicks on three results, it is probably the query that is bad.

This is also true of those who are trying to say NavBoost is the primary EEAT signal. NavBoost is not reading websites. All we know about NavBoost is that it is ON-THE-PAGE metrics on the Serps. As seen above, there are timing metrics included, but whether billy-bob.com is easy to use is not calculated anywhere in this equation.



To gauge fresh thoughts on this topic, I ran a survey last week, posting the request to a few of my social media accounts (FB, LI). The responses were effectively pre-qualified: respondents knew what NavBoost stands for and came from within my SEO sphere of social influence. I received about 110 responses (95 complete). What I found was a very diverse set of opinions.

Let’s start with what is sort of the bottom line for SEOs: “Do you trust Google now that we know they probably lied about not using clicks for two decades?”


Given the split decision there – sigh – OK, fine, you could probably sit around and massage the truth as Google felt it existed. They might very well not use click tracking directly for rankings in any traditional sense of the phrase. Whereas services such as DirectHit used pure click counts, Google may not operate that way. They may feel that no, they didn’t send Matt Cutts out to search conferences for years to say “we don’t use click tracking.” Seriously, to this day, NavBoost may be more of a filter serving to downgrade sites than a pure ranking signal that boosts sites that perform well in the Serps. I call this controversy #1.

Let’s keep going: SEOs are all over the map on what NavBoost does and how it does it:


So let’s stop and take a deep dive into what signals Google (or any search engine) can read on the Serps. We know from the GitHub API that data is available on a wide array of these items. Others can be deduced from these data sets.

I’ve tried to categorize these in ways that make sense given the overall data items likely available. We know most of these are available to Google because of the GitHub API leak. However, all we have are the routines’ requirements. Again, this is the ingredients list without the recipe.

SERP Data Signals

Click & Selection Signals

Pure click data is the most obvious item and the one SEOs seem most enamored with – so let’s look at all the click data that can be tracked:

  • Is this the first, second, or a later click on the page?
  • Is this the first “organic” result they clicked?
  • Total clicks on this Serp
  • Position of the click (rank 1 or …)
  • Was this the last click in the chain?
  • Did they click on a Serp feature first, second, at all? (PAA, Feedback)
  • Was this a mouse click or touch? (Touch screen, tablet, watch, or phone)
  • Rage clicks – clicks the same spot repeatedly but nothing happens
  • Dead Clicks – clicked on page in location that was not active (focus click)
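To make the list above concrete, here is what a hypothetical per-click log record might look like. Every field name here is my own guess at what such an entry could hold; none of this comes from the leaked API docs:

```python
# Hypothetical record of the per-click SERP signals listed above.
# Field names are invented for illustration, not Google's schema.
from dataclasses import dataclass

@dataclass
class SerpClick:
    query: str
    position: int          # rank of the clicked result (1-based)
    click_index: int       # 1st, 2nd, ... click on this SERP
    is_first_organic: bool # first "organic" result they clicked?
    is_last_click: bool    # last click in the session chain?
    on_serp_feature: bool  # PAA, feedback widget, etc.
    input_kind: str        # "mouse" or "touch"
    rage_click: bool = False  # same spot clicked repeatedly, nothing happens
    dead_click: bool = False  # click on an inactive page region

event = SerpClick(query="best hiking boots", position=3, click_index=1,
                  is_first_organic=True, is_last_click=False,
                  on_serp_feature=False, input_kind="touch")
```

At Google scale such a record would be one row among billions, sliced per query, per locale, and per device, as discussed later.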
Time Based Signals
What timing signals could be telling Google is pure guesstimation. This is where we really have no clue what the data means:

  • Timing: How long did the user examine before clicking (Serp dwell time)
  • Click Times: to first click – last click – between clicks
  • Bounce time / Pogo sticking – time spent on destination URL (site dwell time).
  • How long was it since the user last searched?
  • What is the mean click timing between URLs? How does that correlate to rankings and satisfaction?
  • When was the last time they searched?
  • Time-of-day recordings: morning, noon, or night?
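One timing signal from the list – site dwell time between a SERP click and the return to the SERP – is often bucketed in pogo-sticking discussions. A minimal sketch follows; the thresholds are pure guesses for illustration, since we have no idea where (or whether) Google draws such lines:

```python
# Bucket a visit by how long the user stayed before bouncing back to the
# SERP. The 10s/60s cutoffs are invented, purely to illustrate the idea.

def classify_dwell(seconds_on_result):
    """Label a result visit by its dwell time."""
    if seconds_on_result < 10:
        return "pogo-stick"   # near-instant bounce back to the SERP
    if seconds_on_result < 60:
        return "short click"
    return "long click"       # likely a satisfied visit

# Dwell time falls out of two SERP-side timestamps: click out and click back.
click_at, back_at = 1000.0, 1004.5
print(classify_dwell(back_at - click_at))  # -> pogo-stick
```

Note that Google can compute this entirely from SERP-side events, which is consistent with the earlier point that NavBoost is on-the-page metrics, not website analysis.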
UX/UI Signals
Like time-based indicators, UX/UI (user experience / user interface) data sets would be fascinating to look at, because we have no docs or resources to tell us what user interactions could be indicating to Google. That leaves no shortage of theories about what people do on Serps, but if you have ever watched a handful of people in eye-tracking studies, or recordings of search engine usage by average people, you know you are in for surprises. In the eye-tracking studies I have watched, people don’t act the way SEOs generally think they act on search engines – they are all over the map/screen in usage styles. E.g.: the golden triangle is dead.

  • Did they scroll – did they click on an entry or refine the search?
  • Mouse/Touch/Swipe movements between clicks. Hover patterns?
  • Did they move the mouse and hover on another part of the page?
  • Was a SERP feature (like an accordion, image carousel, shopping, map boxes, knowledge panels, and other non-traditional web results such as PAA) expanded or interacted with?
  • Local Map packs or site links interacted with?
  • What are the mouse signals? Did they move the screen around?
  • Did they scroll, swipe, pinch, or zoom?
  • Did they immediately scroll on page to get past Google spam?
  • Windows vs. Mac vs. Linux vs. desktop vs. iPad vs. phone – all of these users use Google differently.
Session Context Signals

I think these signals are extremely important to the overall NavBoost “metric” Google may generate for any given search.

  • First search in a session or refinement of a prior query?
  • Did the user refine the query after clicking?
  • Did they return to the same result later?
  • Was this click part of a pogo-sticking pattern?
  • Did they use any advanced search modifiers or filters?
  • Was this a click back from a deeper page of the Serps?
  • Were other vertical “tabs” on the Serp clicked (images, news)?
  • Are they logged in?
  • Last query abandonment? Did they issue a series of queries and then not click?
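Several of the session signals above hinge on spotting a “refinement” – a new query that reworks the previous one. One plausible heuristic is token overlap between consecutive queries in a session; this is entirely my own guess at how such tagging might work:

```python
# Hedged sketch: tag a query as a refinement of the previous one when
# they share terms but aren't identical. A purely invented heuristic.

def is_refinement(prev_query, query):
    """True if the new query reuses terms from the previous one."""
    prev = set(prev_query.lower().split())
    cur = set(query.lower().split())
    return bool(prev & cur) and prev != cur

session = ["hiking boots", "waterproof hiking boots", "rain jackets"]
flags = [is_refinement(a, b) for a, b in zip(session, session[1:])]
print(flags)  # -> [True, False]
```

A real system would presumably use embeddings or query rewriting models rather than raw token overlap, but the session-chaining idea is the same: each query is judged in the context of what came before it.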
Query Source Signals

Query source signals provide context that can shape rankings before the user even hits the SERP. From GPS data to image-based searches, cross-device paths, partner sites, and other sources, NavBoost may be interpreting signals we’ve barely considered.

  • Click path: What path did they take to get here? Was this a click from the search bar, or from another Google property (such as a map)?
  • Did this query start on mobile or desktop? Was this sent here from either?
  • Was this in a moving vehicle or walking?
  • GPS signals and location. (work, home, other)?
  • Android / Chromebook: eye tracking in the Google app or Chrome browser (theoretical, but I’d bet money the answer is “of course they have” – who knows what is buried in the ToS’s that gives them legal cover).
  • Images: what was the last picture they took – was this circle-to-search?
Demographic and Psychographic Signals
  • Longer term user history analysis. Are they a “one off” or is this a pattern with their demo profile?
  • Is this a frequently searched item for this user (which could indicate it is their job and not a normal pattern for the general public)?
  • Is this consistent with previous patterns, searches, devices, and/or age/gender-appropriate behavior?

Signal Noise and the Inevitable NavBoost Devaluation
Google has to have faced major issues with NavBoost over the years, starting with the advent of tabbed browsing, which changed user click behavior. There have been other technical changes to browsers, and resulting changes in user behavior, in the years since Google started leaning on NavBoost. All those changes make me wonder if NavBoost is now a declining metric. For example, Zero Click Serps are hard to track if you have zero clicks to track (although the lack of a click is also a metric). Here are some of the ways major signal noise has corrupted the NavBoost value:

    • Tabbed Browsing: Of course you have opened a Serp result in a tab. Boom – there is your long “last click,” adding a ton of signal noise to NavBoost.
    • Browser Spoofing: There are millions of users out there on third-party browsers such as Vivaldi, Firefox, Opera, and Brave. Many of these spoof user agents and open Serps in odd ways in order to stay compatible with Chrome. This adds noise to click signals. Additionally, many of these browsers get paid for Google searches (how does that impact NavBoost?).
    • Search Extensions: These can work in a variety of ways that open URL results nontraditionally – some strip URLs and don’t send metrics for Google to track.
      Content Blockers:
    • Ad Blockers: Several of these do not execute Google’s JS link-tracking code.
    • Tracker Blockers: Often included in ad blockers is the ability to block trackers like Google Analytics code on websites. That corrupts a strong signal Google can tie to NavBoost data. For example: 1) a user clicks on a Serp result that Google’s data says should have GA installed; 2) the user never shows up on the website because the tracker blocker kept the GA code from firing. That leads to more noise in the signal. The further related question: in order to get this data, does Google boost sites with GA installed? Pondering this as an EEAT-related signal is left as an exercise for another time.
    • Privacy Actions: This one is deep in the weeds. Remember when Google got spanked for messing with Safari cookies? There is a deep set of signals between cookies, clicks, logged-in users, platforms, locations, ISPs, and other user-ID signals Google could be tracking (see above). Those signals have all had to change due to legal requirements around the world. That has to add to, or corrupt, traditional NavBoost data.
    • Neurodivergents: I guarantee you there are people out there – like me – who never click on what Google thinks they should click on. The only #1 result I ever click on is a navigation query. In one of Google’s DOJ presentation docs, they even say: “Of course, people are different and erratic. So all we get is statistical correlations, nothing really reliable.” In another Google doc, they use labels such as “Psychic Pause” and “Unintended,” and lump these into query-abandonment buckets (which they aren’t – they are simply user interactions Google can’t explain). If we look at other docs from the trial, we see that Google is wrestling with being able to “train on the past, predict the future” in many different ways. However, there is one slide that is kind of startling: “100,000,000,000 clicks provide a vastly clearer picture of how people interact with search results.”
    • AI Slop: As we have seen in the last year, the greatest offender against EEAT would be Google themselves. They have slowly blown away 27 years of search training to redefine the search engine model in the image of Zero Click Serps. The SGE/AIO/Bard/Gemini/whatever-we-are-calling-it-today model of search has thrown a classic monkey wrench into traditional Google usage. How many people scroll straight past the AI slop to start clicking on lower results? Whatever people are doing now, it sure isn’t finding the first result. This alone has to have corrupted the value of their traditional NavBoost data all to hell.
    • Zero Click Serps: Carrying on with that thought, the entirety of Google’s new systems has to have redefined what success is on the Serps. The amount of useless, annoying garbage (“Google Spam”) thrown at the Serps has to corrupt NavBoost data. Is success now “nothing,” because AI answered the question? Or is success defined as a click to another Google property (YouTube, ads, shopping, news, etc.)?
    • Ads: Remember when Google switched from ads on the right to ads at the top, and then from ads marked as Ads to ads marked as Sponsored, to ads with triplicate displays, to shopping and other spammy lions/tigers/bears on the Serps? That had to all but crush traditional NavBoost metrics.
    • Bots Everywhere: Google recently shifted to requiring JavaScript to use Google Search. The downside is that bots using simple web requests from scripting languages like PHP, Python, or Perl to download Serps no longer work. On the plus side, since newer bots need browser code or JavaScript-capable tooling to work, they now have the means to influence NavBoost metrics directly (more on that later). That said, bots must taint NavBoost metrics considerably, and it is anyone’s guess how good Google is at filtering them out of the resulting data set. Having run a site for 27 years that gets mercilessly abused by bots, I will say it is next to impossible to stop them.
    • Slicing Chaos: It must feel a bit chaotic for the NavBoost team to have to slice the data into smaller and smaller chunks. The search paradigm is entirely different than it was just five years ago. Imagine trying to break NavBoost data down into Mobile vs. Desktop, Query String vs. Voice, and now Keywords vs. Prompting. All of that is going to introduce new noise. Granted, Google has billions of users spewing searches at it daily, but you can only slice a pie so many ways until you are out of pie.
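The slicing problem above is easy to demonstrate on a toy click log: every extra dimension you group by thins each bucket. The log rows and dimensions here are invented purely for illustration:

```python
# Sketch of "slicing chaos": the same click log fanned out across more
# dimensions, each slice leaving fewer clicks to learn from.
from collections import Counter

# Invented (device, input source, clicked rank) click events.
clicks = [
    ("mobile",  "voice", "rank2"),
    ("mobile",  "typed", "rank1"),
    ("desktop", "typed", "rank1"),
    ("mobile",  "voice", "rank2"),
]

# Slice by (device, input source): four clicks shrink to buckets of 1-2.
slices = Counter((device, source) for device, source, _ in clicks)
for key, n in sorted(slices.items()):
    print(key, n)
```

With billions of real clicks the buckets start out enormous, but the principle holds: each new slicing dimension (device, input mode, locale, keywords vs. prompting) divides the data again, and thin buckets mean noisier statistics.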

When you combine all those changes with the changes to the Serps, their use of 13 months of data has to have been thrown out or watered down to near uselessness.

That ultimately means thinking about NavBoost as a 13-month endeavor is no longer accurate. That was torn up last year when SGE became AI Overviews and blew up the Serps.

Another prime point to remember is that Google themselves says “Learning from logs is the main mechanism behind ranking.” HELLO?

To me, it is the most amazing slide I have seen since – again – Larry and Sergey’s original PageRank doc.

We know Google has had issues here, because so many docs leaked during the trial point to multiple ranking models (BERT, RankBrain, RankEmbed, RankEmbed BERT, MUM, Instant Glue) that exist to address shortcomings of NavBoost and inject more Zero Click Serp features (knowledge panels, local boxes, PAA, AIO). Those are injected either to fix shortcomings in NavBoost or merely to put money-making features on the Serp. The equation is simple: the longer you keep a person on a site, the greater the chance they click on an ad.

Google also allows people to support Chrome development by sharing user data with them. That would give Google more data to look at even when a user isn’t logged into Google.

(If you want more education on what signals a search engine – or any site, really – can read on-the-page, install Microsoft Clarity for a real old-school education on the subject. It is downright frightening. For privacy fans, please note we have removed Clarity from SearchEngineWorld.)

Tomorrow: Can we Push NavBoost? Hint: Of course we can.

Here is Part One: Necessary Evil: Pushing Navboost – A Fresh Look at One of Google’s Top Ranking Signals


NavBoost Links and Resources