Search Engines and the Law of Unintended Consequences


Search engines exist to help people find information on the Internet quickly and easily. As the amount of content grows, so does the competition among publishers. This article looks at how the desire to be found on the web has turned into a battle between search engines and publishers.

It is a truth universally acknowledged that once a system is in place, people will find loopholes to make it work to their advantage. This is true in all areas of life, from taxes and benefits to search engine algorithms.

Of course, eventually, someone closes the loophole. But people are nothing if not adaptable, and new dodges are always waiting to be discovered.

And so the system grows more complex each time the designers block a new loophole. It’s no surprise that making slight changes to a complex system is likely to have unanticipated side effects – this is the ‘law of unintended consequences’.

By definition, any system involving people is going to be pretty complex from the beginning. People are complicated, chaotic creatures, after all.

For example:

  • Overtime schemes may encourage workers to stretch the work and actually produce less overall.

But what does this have to do with Google’s search algorithms? What follows is a very simplified account to illustrate the point.

The key stakeholders in Internet searches

The search ecosystem is a game with three players: Users, Publishers and Search Engines.

Users

The average user wants to find something out – as quickly and conveniently as possible. They want to be able to rely on the information.

Publishers

Most publishers want the same thing: traffic. Whether it’s to bring in potential customers, promote their ideas or drive ad revenue, the number of visitors matters.

(There are nuances: quality of visitors is also important, but let’s keep this simple.)

Search Engines

These are necessary because of the massive volume of content on the Internet. Without a search engine, the average user would have no way of finding the information they need unless they already knew the exact address. The business model of most search engines is advertising, so they also want to maximise traffic – and the best way for them to achieve this is to provide users with the best results.

So far so good. Search engines and publishers both want traffic, and users want to find the sites – what’s the problem?

People want maximum gain for minimum effort

I won’t go as far as saying people are lazy – but getting the most benefit with as little work as possible is attractive. You can either have more free time or produce more in the same time. What’s not to like?

Because the Google algorithm favours fresh or regularly updated content, publishers need to keep producing new articles. If only there were a way to publish more quickly or to rank more easily…

And so SEO was born. There are two camps: the legitimate ‘white hat’ techniques that aim to work with the algorithm, and the less salubrious ‘black hat’ techniques that are all about looking for loopholes in the system.

Early black hat SEO techniques (don’t try these at home!)

These are ways to game the system, pure and simple. In the really early days, it was all about keyword stuffing and scraping content:

  • Is your article all about weight loss pills? Then create a page full of text that repeats that key phrase, and related ones, over and over and set up a redirect to your article.
  • Content a bit thin? Use white text on a white background to stuff a few more key phrases in there.
  • Can’t be bothered to write anything original? Don’t worry – there’s plenty of good stuff online you can copy and paste…

The result: lots of high-ranking content with little effort – and usually of little value to the user, which was not good for the search engines’ reputations.

Enter PageRank

Google’s key PageRank algorithm looked not only at the page itself but also at how many other pages linked to it. If lots of people value the content enough to link to it, then it’s got to be good quality – right? Unfortunately, this only works for as long as people don’t realise that this is what search engines are looking at when they rank pages. As soon as the number of inbound links becomes a goal to aim for, you get people setting up thousands of spam sites to boost backlinks and bots inserting spam links into blog comments. Whole businesses were set up to create backlinks, and a key technique was to contact other website owners and request reciprocal links – whether they were relevant or not.
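Stripped of the engineering detail, the underlying idea can be sketched in a few lines of Python. This is a toy illustration only – the function, the damping value and the tiny link graph below are invented for the example, and Google’s production system is far more elaborate – but it shows how a page’s score is built up from the scores of the pages that link to it:

    # Toy sketch of link-based ranking (not Google's actual implementation).
    # A page's score is built from the scores of the pages that link to it,
    # shared out across each linking page's outbound links.
    def pagerank(links, damping=0.85, iterations=50):
        """links: dict mapping each page to the list of pages it links to."""
        pages = list(links)
        n = len(pages)
        rank = {p: 1.0 / n for p in pages}          # start all pages equal

        for _ in range(iterations):
            new_rank = {p: (1.0 - damping) / n for p in pages}
            for page, outbound in links.items():
                targets = outbound or pages         # dangling page: share evenly
                for target in targets:
                    new_rank[target] += damping * rank[page] / len(targets)
            rank = new_rank

        return rank

    # 'a' and 'b' both link to 'c', so 'c' ends up with the highest score.
    print(pagerank({'a': ['c'], 'b': ['c'], 'c': ['a']}))

Notice that nothing in the sketch asks whether a link is genuine or relevant – which is exactly the loophole the spammers exploited.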

When link manufacturing happens on a large enough scale, the number of backlinks becomes meaningless as a quality indicator, and the search engines have to update the algorithm to include other criteria.

According to Search Engine Journal, the pace of these changes is now breakneck:

“In its early years, Google only made a handful of updates to its algorithms. Now, Google makes thousands of changes every year.”

– Google Algorithm Updates & Changes: A Complete History

But it’s not just the black hat SEO techniques that drive algorithm changes. Any form of SEO makes it necessary to keep updating the algorithms.

Why? Because as soon as people understand what criteria they need to meet to show up in a search result, it skews what they do. Their focus shifts away from producing content for users and firmly towards pleasing the search engines.

(In a similar way, exam boards continually need to alter the way they test students’ knowledge of a subject. Schools need to show that their students get good grades, so teachers end up teaching students to pass the exam rather than to understand the subject.)

From librarian to guardian of content creation

This is why the more recent and more sophisticated Google algorithms put the emphasis on providing the best user experience. Search intent and site speed, together with keywords and backlinks, are all factors now. And the unintended consequence? Google is now using its algorithm to nudge publishers towards focusing on the quality of their content and sites.
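As a purely illustrative sketch – the signal names and weights below are invented for the example, not taken from Google – ranking now looks less like counting links and more like blending many quality signals into a single score:

    # Purely illustrative: blend several quality signals into one score.
    # The signal names and weights are invented, not taken from Google.
    WEIGHTS = {
        "matches_search_intent": 0.35,
        "keyword_relevance": 0.25,
        "backlink_quality": 0.25,
        "page_speed": 0.15,
    }

    def score_page(signals):
        """signals: dict of values between 0.0 and 1.0 for one page and query."""
        return sum(weight * signals.get(name, 0.0)
                   for name, weight in WEIGHTS.items())

    # A fast page that matches intent well but has few quality backlinks.
    print(score_page({
        "matches_search_intent": 0.9,
        "keyword_relevance": 0.7,
        "backlink_quality": 0.3,
        "page_speed": 0.8,
    }))

The exact numbers don’t matter; the point is that once many signals feed the score, publishers have to pay attention to all of them.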

In other words, the role of Google and other search engines has changed from being mere librarians to setting the guidelines for content and user experience quality. If that’s not an unintended consequence, I don’t know what is.

High-quality content and a good user experience are good things – no one is arguing against that. But is Google, whose business model is advertising, the best and most disinterested controller of this?