In 2016, Google’s Andrey Lipattsev revealed that content is one of the top three ranking factors in the Google Search algorithm. In other words, even though the algorithm weighs over 200 ranking factors, the quality of your content is now a key determinant of how high you can rank in the search results. The Google Panda algorithm, Google Medic, Google Fred and the June 2019 Broad Core algorithm update were all built around the concept of content quality.
Today, the quality of your website’s content can make or break your organic visibility in Google.
In 2015, Google released a 164-page set of guidelines known as the Search Quality Evaluator Guidelines, which was a complete update to the existing guidelines. This provided powerful insights into what Google deems to be quality content.
In these guidelines, Google introduced the concept of E-A-T, stating that demonstrating a high level of expertise, authoritativeness and trustworthiness was one of the most important characteristics of high-quality content.
Raters are now required to assess sites in these three key areas when evaluating the quality of a web page’s content. This means that if your content has a low E-A-T score, it is unlikely to rank on the first page of Google, especially if your site is considered a YMYL (Your Money or Your Life) site.
The Google Panda algorithm update is essentially a content quality filter that analyses the quality of an entire website’s content. It lowers the site’s ranking in the search results if it detects a certain amount of poor quality content on the site. The main objective of the algorithm is to ensure that only high-quality webpages are ever featured on the front page of Google’s search results.
Google Panda was a direct response by Google to the proliferation of spammy, low quality content on the search results pages. It was also a significant change from the way Google ranked websites.
The update was named after Navneet Panda, one of the more influential engineers in that particular update. The aim of the update was to ensure that only relevant, high quality sites were ever featured on Google’s front page for any search query.
When analyzing content, the algorithm looks at a number of factors including the following:
- How comprehensive or in-depth is the content?
- How useful or valuable is the content to readers?
- Does the content solve a specific problem?
- How fresh is the content?
- How original is the content?
- How often is the site’s content updated?
Google Panda is the algorithm update that changed the way webmasters publish content on their websites. Before Panda, webmasters were able to dominate the first page of Google’s search results by publishing tons of useless, highly spun, keyword-optimized content to take advantage of popular searches.
According to Google, the first iteration of the update affected 12% of search results in the US, and in the three years that followed the release of Panda 1.0, updates rolled out roughly every one to two months. Now the algorithm appears to run only a few times a year.
This means that if your site has been hit by the algorithm at one time or another, you can probably recover any lost search ranking by identifying and fixing the issues before the next iteration of the update is released.
Keeping Google Panda at Bay
In order to effectively protect your site from being targeted by the algorithm, it helps to have a clear understanding of Google’s mindset and why this update was necessary. As you’re no doubt aware, Google is the number one search engine, and its mission is to keep its users happy and satisfied so it remains number one.
They do this by making sure the front page of their search results for any given keyword only contains links to relevant, high quality webpages. These are pages that will most likely have the answers that a particular searcher is looking for.
In a nutshell, Google wants webmasters to focus on delivering the best user experience possible so they can send users to the most relevant webpages with the highest quality. When this happens, users immediately get what they want or need and continue to rate Google as the number one search engine on the web.
If you’re doing business on the web, it is critically important to understand what makes a site vulnerable to Google Panda.
Thin and Low Quality Content
Google Panda was aimed primarily at websites that offered little in the way of original or meaningful content. In 2014, Google added some clarity to what they meant by “thin and duplicate content” and changed it to “thin and duplicate content with little or no added value”.
Here’s what does not count as quality content for the purposes of Google Panda.
Many of the websites that got hit by a manual rather than an algorithmic penalty in 2011 frequently published thousands of short-form articles in a bid to get a ranking boost from the Google Freshness algorithm, which was primarily designed to boost the organic search ranking of sites that consistently post fresh content. The problem was that most of the “fresh” content published by businesses looking to take advantage of the algorithm provided little or no value, because many of these articles were heavily spun variations of the original.
For example, if a business was targeting the popular keyword “antiaging superfoods”, they would write one main article such as “Top 10 Antiaging Superfoods”. They would then publish lots of heavily spun variations of that article, changing just the headlines using long-tail keywords such as “Best Superfoods to Keep You Young”, “Superfoods That Boost Antiaging”, “Antiaging Superfoods You Cannot Afford to Go Without” and “The Antiaging Diet”.
As you can probably guess, the overall theme of each and every one of these articles is exactly the same. In other words, the additional articles all solved the exact same problem as the original article and therefore added no real value. Many high-ranking websites that relied on this strategy were hit hard by Google Panda.
The fact of the matter is, even the most sophisticated spinner software cannot create an original article. If the spun versions all solve the same problem, they provide no additional value beyond what the original article provides.
This is the type of content strategy that gets businesses in trouble with Google Panda. Unfortunately, many websites today still rely on this type of flawed content strategy as the basis of their link building strategy. There’s no doubt that Google has raised the quality bar over the last few years, and your content needs to be unique, relevant, informative and of the highest quality in order to rank on Google’s first page. Keep in mind that Google now uses Artificial Intelligence in its ranking algorithm, so it is no longer as easy to game the system as it was in the past.
Building Links to Bad Content is a Bad Idea
If you’re experiencing low organic search rankings, don’t automatically assume that you need more links. Yes, quality backlinks are important to SEO, but if you’re generating links based on this type of content strategy, your overall SEO strategy is flawed.
This is because Panda is a site-wide penalty, not a page-level penalty. If a site contains a certain amount of poor quality content, the entire site falls below Panda’s quality threshold, and the whole site is filtered out of the top ten.
If you have lots of short form articles or are spinning the same article hundreds of times in the belief that you’re creating unique or original content, this could be what is standing in the way of a top search ranking for your target keywords.
Google promotes Panda as a filter rather than a penalty. This means if you’ve been hit by the algorithm, your site will simply be filtered out of the top search results, and you won’t even realize you’ve been penalized. You simply won’t get to the front page, no matter what you do.
If you have a specific type of thin content on your site in the form of doorway pages, you are bound to get hit with a manual penalty based on the Google Panda algorithm. eBay was one of the top brands hammered by this penalty because the entire site was built on the concept of doorway pages.
At the time, eBay lost as much as 80% of its organic traffic, which cost the company an estimated $200 million in lost revenue.
Not Ranking? Look Closer to Home
If you’re stuck on page 3 of the Google search results and can’t seem to move any higher no matter what you do, you may have been filtered out of the top ten search results by Google Panda. Does your content strategy involve publishing a lot of short form or highly spun articles?
Note that unless you have a certain amount of thin content, you won’t get a notification or manual penalty from Google; you just won’t rank high no matter what you do. Instead of building links, start by performing a content audit to identify whether your site has a lot of what Google defines as thin, low-quality content that provides little value.
How to Identify Bad Content on Your Site
As has already been established, content is one of the most fundamental ranking factors, and the quality of the content on your site will determine how high you rank in the search results. If you are experiencing low rankings, a content audit should be the first thing you consider.
Even if you have unique, high-quality content on most areas of your site, low-quality content in some sections can drag down the ranking of the entire site. If you cannot remove bad content, you can use the meta robots tag or an entry in your robots.txt file to block the crawling and indexing of duplicate or low-quality content and prevent it from affecting the ranking of the site as a whole. Note, however, that Google doesn’t recommend using robots directives in this manner.
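For illustration, here is a minimal sketch of both mechanisms. The page and the path are assumptions, not recommendations. Keep in mind that the two do not combine well on the same URL: a page blocked in robots.txt cannot be crawled, so Google will never see a noindex tag placed on it.

```html
<!-- Meta robots tag, placed in the <head> of a low-quality page you
     cannot remove: asks search engines not to index the page while
     still following its links. -->
<meta name="robots" content="noindex, follow">
```

```
# robots.txt: blocks crawling of an illustrative thin-content section.
# Blocking crawling does not remove pages that are already indexed.
User-agent: *
Disallow: /tag/
```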
Poor-quality content scope signals could include:
- Pages that don’t appear to align with the purpose of other pages
- Irrelevant content
- Less than extensive information
- Any form of duplicate content
- General information
- Pages with no purpose
- Auto-generated content
- Pages with the sole purpose of advertising or affiliate income
- Heavily spun content
The fact of the matter is, if you have poor quality content, Google will categorize your site as a low-quality website, and it will be practically invisible in the search results. It is therefore critically important to identify and weed out pages that might be deemed “low quality” in the eyes of Google Panda, and the most efficient way to do this is through a content audit.
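A content audit at this scale can be partly automated. Below is a minimal Python sketch that flags thin or near-duplicate pages; the word-count and title-similarity thresholds are illustrative assumptions, not official Google numbers, and the input is assumed to come from a crawl export (URL, word count, title).

```python
# A minimal thin-content audit sketch. Assumes you have already exported
# a list of pages from a site crawl; the thresholds below are assumptions.
from difflib import SequenceMatcher

THIN_WORD_COUNT = 300    # assumed cutoff for "thin" content
TITLE_SIMILARITY = 0.8   # assumed cutoff for near-duplicate titles

def audit_pages(pages):
    """Flag pages that look thin or near-duplicate.

    `pages` is a list of dicts: {"url": ..., "words": ..., "title": ...}.
    Returns a dict mapping URL -> list of issues found.
    """
    issues = {}
    for i, page in enumerate(pages):
        found = []
        if page["words"] < THIN_WORD_COUNT:
            found.append("thin content")
        # Compare the title against every earlier page's title.
        for other in pages[:i]:
            ratio = SequenceMatcher(
                None, page["title"].lower(), other["title"].lower()
            ).ratio()
            if ratio >= TITLE_SIMILARITY:
                found.append(f"title near-duplicate of {other['url']}")
                break
        if found:
            issues[page["url"]] = found
    return issues

pages = [
    {"url": "/superfoods", "words": 1800, "title": "Top 10 Antiaging Superfoods"},
    {"url": "/superfoods-2", "words": 250, "title": "Top 10 Anti aging Superfoods"},
]
print(audit_pages(pages))
```

A real audit would also weigh traffic, links and conversions per page, but even this crude pass surfaces the spun-variation pattern described earlier.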
The BuzzSumo Study
In 2015, BuzzSumo teamed up with Moz to analyze the shares and links of over 1 million articles. The study found that the majority of content published on the web attracts very few links. In fact, over 75% of a sample of 100,000 blog posts had zero links.
This suggests that the majority of businesses are just not investing in the creation of content that can acquire links naturally. This presents a powerful opportunity for companies that are willing to invest in quality content to dominate their space.
What is a Content Audit?
A content audit is a comprehensive analysis of the quality of your existing content in order to find out whether it is achieving the goals you want or actually damaging your chances of ranking high in the Google search results.
An audit will answer the following questions about your content:
- Does it provide value for your target audience?
- Is it generating the right kind of traffic to your site?
- Is it generating conversions for your brand?
- Does it inspire confidence in your brand?
- Is it relevant?
- Is it duplicate content?
- Is it accurate and consistent?
- Is it credible and trustworthy?
- Are visitors engaging with the content?
- Which of your pages have a higher-than-normal bounce rate (70–100%)?
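The last question above can be answered mechanically once you export per-page data from your analytics tool. Below is a minimal sketch; the function names, the input format and the sample numbers are all assumptions, while the 70% threshold comes from the question itself.

```python
# Flag pages with a higher-than-normal bounce rate from exported
# analytics rows. The input format (url, sessions, bounces) is assumed.
def bounce_rate(sessions, bounces):
    """Bounce rate as the percentage of sessions that left without interacting."""
    return 100.0 * bounces / sessions if sessions else 0.0

def high_bounce_pages(rows, threshold=70.0):
    """Return URLs whose bounce rate meets or exceeds the threshold.

    `rows` is a list of (url, sessions, bounces) tuples.
    """
    return [url for url, s, b in rows if bounce_rate(s, b) >= threshold]

rows = [
    ("/guide", 1000, 350),        # 35% bounce rate
    ("/spun-article", 400, 360),  # 90% bounce rate
]
print(high_bounce_pages(rows))
```

Pages this flags are candidates for rewriting, consolidating or deindexing as part of the audit, not automatic proof of a Panda problem.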