What Google’s September 2019 update was really all about

November 7, 2019

Google’s latest algorithm update in late September left many people scratching their heads, but when you consider it alongside the search giant’s main strategy, everything becomes clear.

First, let’s run down what a great many webmasters experienced: reduced visitor numbers. But this wasn’t the usual swipe at spammers, or the loss of prized keyword rankings.

The keywords where websites lost their rankings were mostly in the ‘gray tail’. These are all the keywords that don’t get searched much and don’t deliver much traffic on their own, but when you put them all together they add up.

As a general rule of thumb, a website gets 80% of its organic traffic from 20% of the keywords it ranks for. Flip that the other way around and it means 20% of your organic traffic is coming from 80% of your keywords.

It’s that 80% that seems to have been hit hardest. In fact, many websites reported their traffic down by 20-30%, while at the same time confirming they hadn’t lost their ranking positions on the targeted keywords that delivered the most visitors.
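The 80/20 arithmetic above can be sketched in a few lines. The visit numbers below are illustrative, not data from any real site:

```python
# Illustrative numbers only: a site getting 10,000 monthly organic visits,
# split according to the rule-of-thumb 80/20 distribution described above.
total_visits = 10_000
head_share = 0.80              # share of traffic from the top 20% of keywords
tail_share = 1 - head_share    # share from the 'gray tail' (the other 80%)

head_traffic = total_visits * head_share   # ~8,000 visits
tail_traffic = total_visits * tail_share   # ~2,000 visits

# If an update wipes out the gray-tail rankings but leaves the head
# keywords untouched, the overall drop lands right around the 20%
# many sites reported:
drop = tail_traffic / total_visits
print(f"traffic drop: {drop:.0%}")  # → traffic drop: 20%
```

This also shows why the drop surprised people: every keyword they actually monitored was still ranking, yet a fifth of their traffic was gone.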

Why is Google targeting the gray tail?

It’s a first step, and there will be more of this to come. This is Google simply testing out technology that it intends to roll out more widely.

To understand it, here’s an example. Let’s say I wrote a blog post like this one about Local SEO, and in it I mentioned ‘pile of pants’ a few times. It’s a phrase I often use when referring to an SEO myth that isn’t true.

Now let’s say the post went viral and CNN, The New York Times, the BBC, Search Engine Journal and a whole bunch of other mega sites linked to the article. It’s going to bubble to the top of Google very quickly for Local SEO related searches.

But it’s going to bubble to the top for all sorts of searches related to the other content of the page, which means phrases like ‘pile of pants’.

Of course that’s wrong: anyone searching ‘pile of pants’ is probably an English language learner looking to find out what it means, or someone who wants to discover where the phrase came from. They’re not interested in my ramblings on Local Search Engine Optimization.

Enter Machine Learning and Artificial Intelligence

Google is aware of this problem, and it is also aware that its strategy of using links as a major factor in deciding who ranks where is becoming increasingly unreliable. Content can go viral and be incredibly popular without links being created, thanks to our ever-growing use of social media and other online platforms.

Google’s answer is Natural Language Processing (NLP): understand what a web page is about and figure out whether it is the best piece of content on the internet about that subject, without relying on backlinks.

In other words, being able to figure out that my Local SEO blog post has nothing to do with ‘pile of pants’, and so not ranking it for that phrase even if the article has the best backlink profile on the web.
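To make the idea concrete, here is a toy sketch of content-based relevance scoring, a simple bag-of-words cosine similarity. This is far cruder than anything Google actually uses, and every name in it is my own invention for illustration, but it shows how a page can score well for Local SEO queries and poorly for ‘pile of pants’ without a single backlink entering the calculation:

```python
# Toy illustration only: score a page against a query purely on content
# overlap, ignoring links entirely. Not Google's actual system.
from collections import Counter
import math

def bow(text: str) -> Counter:
    """Turn text into a bag-of-words frequency vector."""
    return Counter(text.lower().split())

def cosine_similarity(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors (0.0 to 1.0)."""
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

# A hypothetical Local SEO post that mentions 'pile of pants' in passing.
page = bow("local seo tips google my business citations reviews rankings "
           "local search optimization that myth is a pile of pants")

local_seo_query = bow("local seo tips")
pants_query = bow("pile of pants meaning origin")

print(cosine_similarity(page, local_seo_query))  # higher: the page is about this
print(cosine_similarity(page, pants_query))      # lower: just a passing mention
```

Real NLP models go far beyond word counting, of course, but the principle is the same: relevance is judged from the content itself, so a passing phrase can’t hijack the rankings no matter how strong the page’s link profile is.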

That’s exactly what this update was about. Understanding content better and discounting the role of backlinks.

Now, make no bones about it: this has been tried before. In fact, it was the basis of how most search engines worked at the turn of the millennium, and it didn’t work that well because it was easy to game.

But the computing power that wasn’t there back then (to really understand content) is now starting to make its debut. NLP will make it possible to understand the content of a page without the clunky old techniques of keyword density and the like. It will be able to discern a quality piece of content from a $50 article better and better over time.

NLP is in its infancy, but Google’s latest update shows that they are already feeling confident enough to start using it in some areas. The feedback collected from this rollout will give the Machine Learning algorithms further data to refine the technique before it is rolled out again to encompass ever more popular keywords and search terms.

So how did link2light fare?

The majority of link2light clients saw ranking improvements (and none saw ranking drops), because we have been working on page content relevancy for the last decade as a key part of what we do, while many SEO agencies have become blinkered by the endless pursuit of links.

As Google pulled the plug (or perhaps just eased the plug out a little) on links as a major signal for ‘gray tail’ keywords, the more relevant pages we have built for our clients moved in to fill the void.

Personally, I expect to see more and more of this over time, which makes it essential for every webmaster to understand NLP and then go to work creating exceptional, comprehensive content, but in a way that balances this information with usability, especially for eCommerce and Local Business websites.

This means ensuring that pages about particular subjects, products or services contain all the information Google will expect to find in content about them, but without cluttering up the page so much that visitors and potential customers can’t see how to subscribe to a blog, book a service or buy a product.
