It’s a widely known fact (and even a source of amusement) that technology develops at breakneck speed. Moore’s Law, for example, observes that the number of transistors on an integrated circuit tends to double every two years. Then there’s the wry adage trotted out whenever a new TV or laptop is sold: “it’ll be out of date before you’ve even got it plugged in”.
This doesn’t all start and end with hardware, of course: software, processes and customs change just as quickly, if not more so. As such, trends which were commonplace just months ago can soon look achingly behind the times.
One area where this has been especially evident is the world of keywords.
Looking back, it was commonplace to see websites stuff their on-page content with traffic-driving keywords, often to the detriment of quality and readability. More underhand tactics even involved filling the bottom of a web page with text in the same colour as the background, meaning users were unlikely to see it but search engines would still pick it up.
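To illustrate (in a deliberately crude way) why stuffed copy is so easy to spot, the sketch below computes a simple keyword-density score. The function name and example strings are invented for illustration; no search engine actually scores pages this simply.

```python
from collections import Counter

def keyword_density(text: str, keyword: str) -> float:
    """Return the fraction of words in `text` that match `keyword` exactly."""
    words = text.lower().split()
    if not words:
        return 0.0
    return Counter(words)[keyword.lower()] / len(words)

# Illustrative copy only: one natural sentence, one stuffed string.
natural = "House prices in Balham have risen sharply this year"
stuffed = "balham estate agent balham cheap balham homes balham"

# Stuffed copy shows a far higher density for the same keyword.
print(keyword_density(natural, "balham"))  # 1 of 9 words, roughly 0.11
print(keyword_density(stuffed, "balham"))  # 4 of 8 words, 0.5
```

Even a measure this naive separates the two strings cleanly, which hints at why the tactic stopped working once search engines began analysing content quality at all.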
In just the last few years, however, this has all changed and keyword practices have moved on. Now, hyperlinked keywords and obviously stuffed content look dated and rather basic.
So what changed? Well, much of it can be attributed to the work of the search engines. Relatively recently, search engines made public their view that sites should be ranked based on the value they offer each user, not the level of manipulation they’ve undergone in order to look good in the eyes of Bing or Google.
Furthermore, this came after years of investment in the search algorithms, which meant that Google et al were much better equipped to determine the sites which paid too little attention to the needs of their users.
This came to a head – for Google at least – with the roll-out of its infamous ‘Panda’ update in 2011, which brought harsh penalties on those who trotted out thin, duplicated or low-quality content. It also saw keywords morph from a handy linking tool into something which Google could – and did – use to punish sites that weren’t using them for the benefit of users.
Then, more recently, Google rolled out its Hummingbird update, which again focused on content, albeit slightly differently. With it, Google revealed that it could now assess searches in a much more human fashion, meaning users could input so-called “semantic searches” (such as a question or full sentence) and have the correct answer returned. The days of typing bare, syntax-free keywords into Google were coming to an end.
On a practical level, both updates meant that Google had redefined keywords – turning them from a golden ticket to be used at every opportunity into something which could bring harsh penalties. Over-using them risked a site dropping out of the search rankings altogether, although abandoning them entirely didn’t sit well with those for whom keywords had formed a central pillar of their content marketing strategy.
Thankfully, what Panda took away, Hummingbird returned in kind – easing the fear that doing away with keywords would make a business harder to find.
Hummingbird unquestionably made Google smarter and more human-like. It meant that a website no longer needed to contain the exact wording people were searching with in order to be returned as a result.
To give an example, a non-semantic search for estate agents in Balham would typically have read: ‘estate agent balham’. Businesses targeting this exact search would then try to force the syntax-free triplet into their web copy. Doing so without putting an ‘in’ between ‘agent’ and ‘balham’ is difficult, often leaving the sentence looking clunky or amateurish.
Through Hummingbird, however, anyone searching for ‘estate agents balham’ would still see a site returned, even if its on-page news copy reads along the lines of: “The recent property price rises in areas such as Balham and Clapham have driven more sellers than ever through the doors of London’s estate agents.”
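As a rough illustration of why exact phrasing no longer matters, the hypothetical sketch below matches a query against page copy on crudely normalised word overlap rather than on a verbatim string. Real semantic search is vastly more sophisticated; the helper names and the normalisation logic here are assumptions made purely for the example.

```python
def tokens(text: str) -> set[str]:
    """Lowercase, strip basic punctuation and trailing 's' (crude plural handling)."""
    return {w.strip(".,").rstrip("s") for w in text.lower().split()}

def matches(query: str, page_copy: str) -> bool:
    """True if every (normalised) query word appears somewhere in the copy."""
    return tokens(query) <= tokens(page_copy)

copy = ("The recent property price rises in areas such as Balham and "
        "Clapham have driven more sellers than ever through the doors "
        "of London's estate agents.")

print(matches("estate agents balham", copy))  # True: all words present
print(matches("estate agent balham", copy))   # True: singular still matches
print(matches("letting agent balham", copy))  # False: 'letting' is absent
```

Note how the natural-sounding sentence satisfies the query without ever containing the awkward phrase ‘estate agent balham’ verbatim – which is the whole point of the Hummingbird shift described above.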
The future of keywords
So where does this leave keywords today? Whilst Panda may have made it seem as though they were dead or dying, Hummingbird then went and made them easier to use than ever before.
Looking ahead, it seems as if the Hummingbird approach will be the most favoured – suiting as it does all involved parties: users, website owners and search engines.
From the user’s point of view, they continue to get the sites best suited to their needs, not just those which have been manipulated with search engines in mind. Site owners, meanwhile, don’t need to get creative with language and syntax in order to force specific phrases in where they really shouldn’t go. And, last of all, search engines continue to provide the best results (thereby ensuring users are happy with the service and aren’t tempted away to competitors), whilst stepping ever closer to the dream of a computer that can answer any question posed to it – like in the sci-fi films of yore.
All this means that keywords are far from dead, even if the rigidity with which they were once deployed most certainly is. Search engines are getting smarter, which means they do not need to be spoon-fed information in tiny, hyperlinked chunks. If there is sufficient information on a website about homes in Balham or the property market therein, Google will be able to work out that this is the site of a local estate or letting agent.
It will also view the content as much better quality (as it isn’t rammed with superfluous keywords), so will afford it a better quality score – and therefore a higher ranking.
As such, keywords are far from dead. In fact they are still incredibly useful so should be deployed sensibly – that is, when it will benefit the user and make their experience a much better one. Then, with the user on side, search engines will be sure to follow.