Why You Shouldn’t Trust ‘Definitive’ Lists of Ranking Signals
As SEOs, it is our mission (and of course, our pleasure) to sit patiently in front of a screen, day after day, monitoring and obsessing over our rankings. How convenient, then, that I’ve just found a bunch of “definitive” Google ranking factor lists, each one detailing “all 200” of the signals used to rank websites! That saves me a lot of work.
Except that it doesn’t. Oh no. As the scaremongering title of this post suggests, so-called “definitive”, “complete” and “exhaustive” lists of SEO ranking factors are a danger to us all. Why?
Well, firstly, Google is famously secretive about the signals it uses to rank sites, so we can’t be definite about anything. All we can do is make inferences based on our own in-depth analysis, and that of authorities on the subject such as Moz and Searchmetrics.
Even then, the trends that we encounter are correlational and not necessarily causal. In other words, all they give us is a set of characteristics shared by high-ranking sites; they don’t indicate whether or not a given characteristic is a factor in the ranking of those sites.
Another issue is context. The complexity of Google’s ranking algorithms is such that any given search can retrieve a multitude of different results pages, depending on factors like user demographics, search history and the characteristics of the device used to make that search.
And then there’s the transient nature of Google’s ranking algorithms. It would be wonderful if we could all just commit one of these Google ranking factor lists to memory, base all of our SEO activities around it and rank at number one for ever more.
But because Google updates its algorithms at least once a day – often very subtly – the hope of ever being able to do such a thing is nothing short of fantasy (albeit one that is shared by SEOs the world over).
When you think about it logically, if ever there were one single, definitive set of ranking signals upon which we could rely, wouldn’t we all be using it? And what then?
Okay, so can we ever be certain of ranking well in Google’s SERPs?
We can’t. But we can darn well try. Rather than following lists and chasing dreams, we need to have a whole different attitude towards SEO. We need to take a pragmatic, research-led approach that seeks to make life easier for the algorithms, rather than continuously looking for ways in which to exploit them – however tempting this may be.
Ian Lurie of Portent Inc. sums up this approach very eloquently:
- Smart, targeted content promotion
- Maximizing crawl visibility and performance
- Testing the crap out of everything
- Keeping a log of everything
- Getting content performance down to a science
So, rather than relying on Google ranking factor lists, keep up to date with all the latest research published by SEO authorities such as Moz and Searchmetrics. Always remember that these are only correlation studies, so infer what you can from them but don’t be tempted to jump to any conclusions.
Be sure to test the impact of your own SEO efforts, and base future actions on the conclusions that you draw. Every website and every target market is different, so no matter how much time you spend scouring the web for clues, it’s important to experiment on your own site and customers.
Of course, Google ranking factor lists can be useful as a starting point, alerting you to trends across the world wide web, but they should by no means be relied upon as your sole source of information. If you’re striving for long-term gain, you need to put the hard work in yourself, or find someone who is willing to do so on your behalf.