Carl Benedikt Frey, director of the Oxford Martin Programme on Technology and Employment, joins many other researchers in separating what artificial intelligence can achieve from what it cannot.
Though many fear that artificial intelligence will eventually eradicate the need for a human workforce, experts like Frey have drawn a line in the sand, arguing that there are certain areas in which human intelligence cannot be beaten by its artificial counterpart: social intelligence, perception and manipulation, and, most importantly, creativity.
Advanced robots are capable of combining old ideas to generate new ones. But according to Frey, they are not able to form genuinely original, valuable ideas, whether in music, poetry, recipes, or jokes.
Subjective judgement, which walks hand in hand with creativity, is another area in which artificial intelligence does not seem to excel.
In an article published on Content Bacon, the author explores the use of artificial intelligence in content creation, but concludes that though there are most certainly robots creating content—probably at this very moment—there is one element of content creation that robots can't replicate: storytelling.
But this raises a problem.
If it is generally accepted that artificial intelligence is not capable of storytelling, and that it is a poor judge of creative works, then why do we leave our online content to the mercy of search engine algorithms' judgement?
In this article, we’ll examine how the search engine’s inability to effectively judge content quality requires it to rely heavily on other factors, pushing quality to the bottom of its list of priorities.
This has a knock-on effect on the marketing industry by creating a ranking-centric approach that dilutes, or entirely eliminates, investment in and dedication to content quality.
What Do We Mean By Content Quality?
Given that this article will discuss the unfortunate de-prioritisation of quality in content marketing, it’s important to define what we mean by ‘content quality’ before going any further.
By quality content, we mean content that is original, technically correct, and well-structured, features a clear narrative or argument, and demonstrates thought leadership.
This means the content is not simply a rewrite of another person's or business's work: it contributes something new and valuable to the conversation and to the collective knowledge base that is the internet.
Creating quality content isn’t easy. It requires significant time-investment, dedication, and a specific set of skills.
A Means To An End
As a marketing consultancy, we speak to companies looking to grow their business and increase profit.
Overwhelmingly, the business owners we speak to have one particular ambition in mind: ranking on search engines. Understandably, most view ranking highly on Google as the key benchmark of success for their business.
Though ranking is of course crucial to visibility, this ranking-centric view contributes to what we view as an industry-wide oversight of the importance of content quality.
But business owners can't be held wholly accountable for this development, because it is perpetuated by an algorithm that can only pass judgement on content quality by measuring other metrics: load speed, crawlability, site security, keyword density, keyword relevance, and backlinks.
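Keyword density is the simplest of these proxies to pin down: the share of an article's words taken up by a target phrase. The sketch below illustrates how crude the metric is; it is an assumption-laden toy, not Google's actual (undisclosed) implementation, and the example text and function name are invented for illustration.

```python
# Hedged sketch of the 'keyword density' metric: the fraction of an
# article's words consumed by occurrences of a target phrase.
# Illustrative only; search engines' real scoring is not public.

import re

def keyword_density(text: str, keyword: str) -> float:
    """Occurrences of `keyword` as a fraction of total words, in percent."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    kw_words = keyword.lower().split()
    n = len(kw_words)
    hits = sum(
        1 for i in range(len(words) - n + 1)
        if words[i:i + n] == kw_words
    )
    # Each hit spans n words; density = words used by the keyword / all words.
    return 100 * hits * n / len(words)

article = (
    "Content marketing works best when content marketing serves readers. "
    "Quality beats quantity in content marketing."
)
print(round(keyword_density(article, "content marketing"), 1))  # → 40.0
```

A density this high would read as obvious keyword stuffing to a human, yet a purely numeric check has no way to tell stuffing apart from a genuinely focused article: exactly the gap this section describes.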
The existence and prioritisation of these alternative metrics creates loopholes that allow content creators to forgo the effort and investment associated with creating high-quality content. These loopholes facilitate cost-savings and contribute to the growing swamp of generic, low-quality content designed to satisfy artificial intelligence over real, human readers.
The Backlinking Loophole
Although certain factors, such as number of backlinks, are likely to correlate with high-quality content, they can’t guarantee it.
There are multiple strategies that take advantage of backlinking loopholes, enabling businesses to gain backlinks without necessarily having to create original or high-quality content.
While Google does its utmost to close these loopholes, they continue to prevail in many cases. Paying for backlinks, for instance, is very popular. Recently, I spoke to a client whose agency was sourcing backlinks for them, all of which sat in a link farm in the form of 300-word articles.
Despite the low quality of the content, the strategy was working, which incentivised them to spend more on it; persuading them to invest in content quality instead became very challenging. And who can blame them?
The algorithm also takes into account user experience (UX) metrics, like click-through rate, time on site, and bounce rate. But again, marketers can manipulate these metrics without requiring content creators to invest time and effort in writing original, valuable content. Clickbait titles, for example, are an effective way to improve click-through rate.
The Catch-22 in Content
In this Catch-22 situation, our dependence on algorithms that are inherently unable to measure content quality necessitates alternative metrics like backlinks and keyword density. And the more complicated satisfying the algorithm and pandering to these metrics becomes, the further down the list of priorities content quality falls.
Where The Algorithm Falls Short
We have already examined the fundamental reasons why entrusting content quality judgement to artificial intelligence doesn’t work, but the algorithm’s shortcomings in preserving and encouraging content quality don’t stop there.
The Lack of Human Gatekeepers
Before the digitalisation of content, when far fewer factors determined the value of a piece of writing, content quality was a greater priority.
In traditional publishing, for example, the merit of a piece of work is determined by industry experts: literary agents, editors, and established publishing houses. These are the gatekeepers preserving the integrity of content creation as a craft, rather than a means to an end.
But online content functions very differently.
Online content has a different gatekeeper: an algorithm that, though highly sophisticated, cannot compete with a human's ability to pass subjective judgement on creative works. After all, search engine algorithms are designed, first and foremost, to crawl an incomprehensibly vast amount of content.
The Content Duplication Loophole
According to Moz, up to 29% of the web is duplicate content.
And this figure doesn’t even take into account duplicate ideas.
The Google algorithm is able to recognise exact matches in content: a duplicated phrase, sentence, paragraph, or even an entire article. What it cannot pick up on is the duplication of ideas.
Provided an article has been rewritten in a way that avoids exact-match phrases, Google is unable to recognise that the ideas within it are not original.
This means it is easy to mimic the exact structure of an article, with the same narrative and order of ideas, without being flagged as a duplicate. As long as the sentences are rewritten, articles like these slip under the radar.
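To make the gap concrete, here is a minimal sketch of exact-match duplicate detection using overlapping word sequences ("shingles"), a standard technique in this space. This is not a claim about how Google actually works; the texts and function names are invented for illustration. The point is that any exact-match approach gives a paraphrased article, carrying the very same ideas, a perfect originality score.

```python
# Hedged sketch: exact-match duplicate detection via word shingles.
# Illustrates why paraphrased ideas evade exact-match checks;
# not a reconstruction of any search engine's real algorithm.

def shingles(text: str, n: int = 4) -> set[str]:
    """All overlapping n-word sequences in the text, lowercased."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap(a: str, b: str) -> float:
    """Jaccard similarity between the shingle sets of two texts."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

original = "Quality content requires time, skill, and a clear narrative."
copied = "Quality content requires time, skill, and a clear narrative."
paraphrase = "Writing well takes hours, expertise, and a coherent story."

print(overlap(original, copied))      # identical text: 1.0
print(overlap(original, paraphrase))  # same idea, new words: 0.0
```

The copied article is flagged instantly, while the paraphrase, structurally and conceptually identical, scores zero overlap and sails through.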
This practice isn’t even discouraged in many cases. In fact, outsourcing industry-specific content writing to copywriters typically requires them to source their ideas from already existing content. Given that they may not have personal knowledge of the subject matter, this is pretty unavoidable.
However, it is worth noting that a good copywriter will compile ideas from multiple sources, rather than mimicking the narrative of a single blog post.
What Does This Mean For Content Creation?
It’s simple really. To avoid the duplication of ideas, a writer would have to be proficient enough in the subject matter (and have enough time and determination available to them) to posit their own theories, without leaning heavily on the ideas of others.
And herein lies the problem…
For business owners, this level of expertise and time-investment on the part of a copywriter is expensive to source.
Of course, this is exactly how content creation functions in the book-writing world. If I were to rewrite Jane Austen’s Pride and Prejudice, for example (and call it Ego and Bias), it’s fairly safe to assume that no one would publish it. And if anyone ever did read it, they’d probably be pretty miffed at me.
To succeed in writing books, poetry, or screenwriting, the exact replication of ideas won’t get you very far. Sure, you can draw inspiration and you can re-work already existing ideas, but to duplicate them exactly would be to ruin your work’s chances of success.
Sadly, the world of online content lacks the regulation necessary to prevent the duplication of ideas.
And given how much faster it is to reproduce something than to create something original, it's no wonder that so many content ideas are duplicated.
In practice, this means that bigger businesses with a greater presence can easily replicate the work of a small business. Because the established business has a stronger SEO profile, its article is likely to rank higher, despite not being the original source of the ideas.
The State of Online Content Creation
Given the state of content creation online, is it any wonder that fewer and fewer businesses are putting in the effort to create something entirely new and valuable?
Even if they were to, what guarantee do they have that their hard work will pay off? Or that anyone will even see their article? Or that it won’t be replicated by someone with greater visibility?
This brings us onto the next reason why the algorithm is killing content creativity: the unfair playing field.
The Unfair Playing Field
Established businesses have a massive advantage over new businesses, as we have already explored in the previous point on duplication. But the ability to duplicate content ideas with impunity is only a small part of a much wider problem.
In the article, Internet Monopolies Should Be Owned By Us All, the author acknowledges that: “The internet has been a virtual wild west, fast-moving and little regulated.”
In traditional business environments, we take measures to prevent companies having a monopoly in their industries. One of the primary methods of preventing monopolies involves removing or lowering barriers to entry.
However, the online world is largely unregulated. Anyone can produce and post content, which means the barriers to entry are practically non-existent. But it also means that "entry" has become practically meaningless.
Simply posting content doesn’t mean you’ll have any visibility or that you’ll be able to reap any rewards from that content.
The nature of the Google algorithm means that only a select few businesses are able to rank.
Only a fraction of businesses make it to page one of Google for any of their keywords. And yet the first page of Google captures between 71% and 92% of web traffic.
Credibility and authority are among Google's key metrics for determining ranking, which means that established businesses can bury the competition with minimal effort, and the chances of a competing business breaking through are slim.
A business with strong SEO could write exactly the same article as a new business and rank number one on Google, while the new business's article might never be seen by anyone.
To translate this into non-digital marketing efforts, let's imagine that a new business and an established business both pay for a local billboard advertisement. They pay the same amount of money and achieve exactly the same visibility: the same number of individuals see each billboard.
Online, the same cannot be said.
Though the same amount of effort and resources will go into creating an article for both businesses, the new business will struggle to achieve even a tiny fraction of the visibility achieved by the established business.
The Content Churn
It’s largely businesses with a significant budget or in-house capacity that practice content churn. This practice involves pushing out as much content as possible, to saturate the online marketplace.
Content churn, by its very nature, serves the purpose of maximising output with minimal effort.
This means that copywriters participating in this activity often duplicate content ideas (re-writing existing articles so Google doesn’t flag them as plagiarised).
To keep time investment to an absolute minimum, content that is churned out in this way is rarely original and often has very little to contribute.
Because the internet is mostly unregulated, content churn can be very effective.
With ample investment in SEO and backlinking, businesses can afford to push out low-quality content without compromising their ranking, because the algorithm will recognise them as an authority.
This means that businesses with sufficient resources can effectively monopolise web content, which is very problematic for small businesses.
It also means that people using search engines are becoming increasingly frustrated, as it becomes harder for them to find answers to their questions or to locate original content that demonstrates genuine thought leadership.
Does Content Quality Even Matter?
In an online marketplace where visibility and ranking is prioritised above all else, what does the future look like for content creation?
In a system that discourages creativity, risk-taking, and originality, can we really blame businesses for taking the easy route, whether through content churn, duplication, or backlinking loopholes?
Though ranking may feel like the end goal, as an objective every business is fighting tooth and nail to achieve, it is important to remember that ranking does not mean conversion. It does not automatically mean sales or profit or growth.
Ranking is simply visibility, and visibility can only achieve so much.
But what happens after you get seen? Once you feel that you have satisfied the algorithm, how then do you go about satisfying your human visitors?
Content quality marks the difference between:
- Others seeing you as an innovative thought leader in your industry
- Others disregarding your content as ‘more of the same’
To stand out, you have to look much further than the first page of Google.
What’s The Solution?
As far as we're concerned, the ranking algorithm is doing its very best to compensate for a fundamental insufficiency: it can't master the human gift for subjective judgement.
We believe, as do many researchers, that in this area, artificial intelligence will never be able to compete with humans.
So where do we go from here?
How do we change the trajectory of the marketing industry and revive commitment to content quality?
It starts with you.
It starts with every business owner and content marketer choosing to put quality content first and ranking second.
And it starts with marketing strategies that utilise content to its full potential.
By investing more effort, resources, and funds into the creation of content, and by seeing it as more than a means to an end, businesses can not only safeguard their long-term conversion ambitions but change the industry from the bottom up.
Over time, by putting out valuable, thoughtful content, businesses will build a true following: one based on long-term authenticity, rather than short-term spin.
After all, it's a lot more satisfying to perform to a human audience than a robot one. If you'd like to attract more than Google bots, and truly connect with your audience, get in touch for a free, no-obligation chat.