If your site has experienced a sudden and significant drop in organic traffic for a number of key search terms you once ranked highly for, you may have been hit by a Google penalty.
Here’s what typically happens when you get hit by a Google penalty:
If you’ve been affected by a Google Penalty, you’ll probably see an initial drop, followed by a sequence of bigger drops.
In this guide, we provide an overview of some of the most important Google penalties that are still in play today.
Below is a handy table of contents that allows you to skip to a specific topic:
- Manual penalty
- Algorithmic penalty
- Symptoms of an Algorithmic Penalty
- The Google Penalties
- The June 2019 Broad Core Algorithm Update
- Google Panda
- Google Penguin
- The EMD (Exact Match Domain) Update
- Google Mobile Friendly Update
- Top Heavy Update
- Payday Loan Update
- Unnatural Outbound Links
- Major Spam Problems
- Google Fred
- The Medic Algorithm Update
- The Macabee Update
- The Panguin Tool
There are two types of Google penalties: manual and algorithmic.
The Manual Penalty
If a manual penalty has been applied to your site, you will have received a notification from Google’s spam team within your Google Search Console account. If the team finds something they believe is manipulative or against Google’s webmaster guidelines, they can impose a penalty.
The most common cause of a manual penalty has to do with spammy backlinks. In extreme cases, your entire domain may be removed entirely from Google’s index. The only way to respond to a manual penalty is via a reconsideration request, which Google must then approve before the penalty is removed.
When a manual penalty has been applied to your site, you need to remove as many manipulative links as possible. This often involves auditing your entire inbound link profile, contacting the webmasters of the sites that link to yours, and politely asking them to remove the links. If a webmaster ignores you, you may have to resort to the disavow links tool to discount the spammy links. Thereafter, you can send a reconsideration request to Google.
Manual review penalties are usually more severe and tougher to recover from than algorithmic penalties. In extreme cases, it would be better to simply retire the domain name and start all over again.
The Algorithmic Penalty
This is where Google has automated its detection of breaches of its quality guidelines. An algorithmic penalty typically occurs after Google updates its algorithms. These penalties are harder to detect because you don’t get any notification within your Google Search Console.
However, you can simply analyse your website’s traffic data in Google Analytics to see if your rankings (and traffic) took a noticeable dive on or around the time of a specific algorithmic update. If so, there’s a good chance your site may have been affected by the update.
Mozcast is an algorithm “weather forecast” by Moz that monitors the effects of Google’s algorithm changes across a massive number of sites and keywords. The sheer number of monitored sites provides the data Moz needs to determine, with considerable accuracy, whether Google has released an update, as well as to estimate its extent.
Mozcast is a great tool for analyzing why your site rose or fell in the search results. The hotter and stormier the weather, the busier Google has been with algorithm updates, and the more Google’s rankings have changed.
Here’s a screenshot:
As you can see from the five-day forecast, temperatures have been moderately high, which means Google has been tinkering with its algorithm.
Mozcast updates its results at around 11:30 a.m. EST, so keep in mind that the forecast you see reflects the previous day’s results.
Symptoms of An Algorithmic Penalty
A specific group of links suddenly stops providing value.
If you find that some of your web pages have suddenly dropped out of the search engine results pages altogether, those pages could have been affected by a recent algorithmic penalty.
This is what happens when a private blog network that has been feeding those web pages with link juice is identified as spam and deindexed. Consequently, your site no longer receives the link juice those penalized links were providing.
Entire domain starts ranking lower for all or most of its target keywords or phrases.
This is a clear indication that the website in question has breached Google’s webmaster guidelines. In this case, every keyword will rank 30 to 50 spots below where it did before the penalty. In effect, Google adds 30 or more positions to your site’s ranking every time it comes up in the search results, pushing it down three or more pages.
If you notice a drop in organic search traffic that corresponds with known algorithm updates, you can be fairly confident about which update has affected your site. To confirm, compare your analytics data against Moz’s Google Algorithm Change History.
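A quick way to perform this comparison is to script it. The sketch below is a minimal illustration in plain Python: the traffic figures, the update list, and the 30% drop threshold are all made up for the example, and in practice you would load the daily sessions from a Google Analytics export.

```python
from datetime import date

# Hypothetical daily organic sessions exported from Google Analytics.
traffic = {
    date(2019, 5, 30): 1180,
    date(2019, 5, 31): 1150,
    date(2019, 6, 1): 1190,
    date(2019, 6, 2): 1160,
    date(2019, 6, 3): 640,   # June 2019 Core Update rolls out
    date(2019, 6, 4): 610,
    date(2019, 6, 5): 590,
}

# Known update dates to check against (e.g. taken from Moz's
# Google Algorithm Change History).
updates = {date(2019, 6, 3): "June 2019 Core Update"}

def flag_drops(traffic, updates, threshold=0.30):
    """Flag updates where average traffic fell by more than `threshold`,
    comparing the 3 days before the update to the 3 days after."""
    days = sorted(traffic)
    flagged = []
    for when, name in updates.items():
        before = [traffic[d] for d in days if d < when][-3:]
        after = [traffic[d] for d in days if d >= when][:3]
        if not before or not after:
            continue
        change = 1 - (sum(after) / len(after)) / (sum(before) / len(before))
        if change > threshold:
            flagged.append((name, round(change, 2)))
    return flagged

print(flag_drops(traffic, updates))
```

With the sample figures above, the June 2019 update is flagged with roughly a 47% drop, which is exactly the kind of date-aligned dive the text describes.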
The Google Penalties
Here are some of the most important Google penalties that may have been applied to your website, in no particular order:
The June 2019 Broad Core Search Algorithm Update
A good example of an algorithmic penalty is Google’s June 2019 Core Search Algorithm Update. Some big websites such as the Daily Mail lost as much as 50% of their organic traffic with the release of the update.
Google has been using human raters to analyse its search results as far back as 2005. These raters use a 164-page set of guidelines known officially as the Search Quality Evaluator Guidelines. These guidelines are a set of instructions that Google’s human raters are required to follow when analyzing websites for quality, and they are usually updated roughly once a year.
In July 2018, Google updated its quality rater guidelines to redefine what is meant by low quality content. Google instructed its quality raters to rate a page as “Low” if any one or more of the following is present:
- An inadequate level of Expertise, Authoritativeness, and Trustworthiness (E-A-T).
- The quality of the MC (main content) is low.
- There is an unsatisfying amount of MC for the purpose of the page.
- The title of the MC is exaggerated or shocking.
- The Ads or SC (supplementary content) distract from the MC.
- There is an unsatisfying amount of website information or information about the creator of the MC for the purpose of the page (no good reason for anonymity).
- A mildly negative reputation for a website or creator of the MC, based on extensive reputation research. If a page has multiple Low quality attributes, a rating lower than Low may be appropriate.
For more details, see the full PDF document here.
It’s important to note, however, that quality raters cannot directly affect the ranking of a particular website. Rather, they pass feedback to the engineers who write Google’s algorithms. Based on that feedback, Google may release an algorithm update, which would then affect search rankings.
Based on these guidelines, Google’s June 2019 update was applied to various sites, including sites with poor quality content, sites without information about the content creator, and sites with excessive ads and sensationalized, clickbait-style headlines.
According to Google, articles with clickbait-style headlines should now be considered low quality, regardless of the actual quality of the main content of the site. Google shed more light on clickbait headlines and articles, stating:
“Exaggerated or shocking titles can entice users to click on pages in search results. If pages do not live up to the exaggerated or shocking title or images, the experience leaves users feeling surprised and confused… Pages with exaggerated or shocking titles that do not describe the MC well should be rated Low.”
Google also made the following additions to the low quality pages and lowest quality pages sections, which shows a focus on the reputation and authority of a site’s author:
- Extensive research is required to evaluate the reputation of a content creator.
- ’Your Money, Your Life’ pages with no information about the content creator should be rated lowest.
- Unmaintained websites should be rated lowest quality if they fail to achieve their purpose due to the lack of maintenance.
- Content should be rated lowest if the creator has a negative or malicious reputation.
Over the years, Google has introduced various algorithmic updates. Each update ends up with a name, such as Panda, Penguin or Hummingbird. If your site has been affected by one of these algorithms, your rankings could drop significantly or get wiped out overnight for a period of time.
The first step to recovering from a manual or algorithmic penalty is to find out exactly which penalty has been applied to your site. SEO recovery tools make this much easier.
Google Panda
The Google Panda algorithm was released on February 24th 2011, and was written to detect and penalize low quality content on web pages. It is a powerful content quality filter that analyses the quality of an entire website’s content. According to figures from Google, Panda has sent over 40% of websites into penalty.
The update was specifically designed to derank sites with low-quality or thin content. It also targets sites with duplicate or plagiarized content, user-generated spam, and keyword stuffing.
It is important to understand that Panda is the hardest of all the updates to identify: the majority of site owners under a Panda penalty are not even aware they’ve been hit.
Google Panda penalizes the following websites:
- Websites with thin content. This includes sites with lots of images and video, but very little text and information.
- High volume content farms with low quality content. A content farm produces large amounts of content specifically to attract traffic from search engines and use those page views to generate easy advertising revenues. This includes large content networks with lots of low quality or duplicated content such as Squidoo and Hubpages.
- Websites with improper SEO structure.
- Sites with duplicate content. Panda not only looks for people scraping content and putting it on their site, but it also looks to see if you are gaming Google by using the same content on different pages of your site so that you can rank for different towns and cities. This also includes businesses with multiple sites that contain nearly identical content on each site and pages. Ecommerce sites are particularly vulnerable to Google Panda due to issues of duplicate content and thin content.
- Sites with excessive ads. These include sites that are specifically designed to host AdSense ads. Google devalues those sites because sites with excessive ads provide a bad user experience.
- Sites with poor grammar.
- Sites with slow loading times.
- Thin affiliate sites. This includes sites that use stock product descriptions that appear across many other sites. If you want to create an affiliate site, you have to create one that offers value and is unique. It is okay to get your statistics from the main product site, but avoid copying paragraphs and sections of the original site. Sentences should be in your own words, and the title and alt tags on each page should be unique.
- Websites with poor usability and branding.
- Over-optimized websites.
- Travel sites with poor or duplicated reviews.
User experience had always been an important factor to Google before Panda, but after the update it became a significant ranking factor. Thus, a high bounce rate for a particular keyword might suggest that your site did not satisfy the searcher and does not offer a good answer for that particular search query.
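The duplicate-content problem described above, such as near-identical pages swapping in different town or city names, can be roughed out with a simple shingling comparison. This is only an illustrative sketch, not how Panda itself works; the two page snippets are invented:

```python
def shingles(text, k=4):
    """Split text into overlapping k-word shingles."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a, b, k=4):
    """Jaccard similarity of two pages' shingle sets (0.0 to 1.0)."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

# Two hypothetical city pages that reuse the same template copy.
page_a = "we offer the best plumbing services in london call us today"
page_b = "we offer the best plumbing services in leeds call us today"

# Pages scoring well above the level expected for unrelated text
# are candidates for consolidation or rewriting.
print(round(similarity(page_a, page_b), 2))
```

Run over every pair of pages on a site (or against scraped external copies), this kind of check surfaces the templated duplicates that make ecommerce sites particularly vulnerable.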
Google Penguin
Google Penguin was first announced on April 24, 2012. Since then, there have been seven Penguin updates. Google Penguin looks for spammy and irrelevant links. The algorithm works by analyzing the inbound link profile of every website for over-optimized anchor text.
If a backlink profile is dominated by exact-match keyword anchors and lacks branded anchors, naked URLs, or generic anchors (i.e. “click here”, “more info”, “read more” or “here”), then the link profile looks heavily optimized, and the site is likely to be susceptible to a Penguin penalty.
Non-descriptive text links such as “read more”, “click here”, “check out this website” and “visit us here” are a great way to keep your profile looking natural and richly diverse.
In addition, there are three main backlink factors that can be used to identify these types of link patterns:
- Link quality – Sites with a natural link profile will include both high and low quality links. Manufactured link profiles tend to have lots of just low quality links or only high authority links (like from a private blog network).
- Link growth – Sites with manufactured link profiles tend to build lots of links within a very short period. Sites that build links naturally tend to build links steadily over time. Avoid unusual spikes in link growth.
- Link diversity – Legitimate sites attract links from diverse sources (contextual, blog comments, news sites, resource sites, etc.). However, links from very few sources (such as blog comments and directories) are considered manipulative.
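The anchor-text side of this analysis can be sketched with a simple tally. Everything in the example below is made up for illustration: the anchor list, the brand name, and the set of generic anchors are assumptions, and real audits would pull anchors from a backlink tool export.

```python
from collections import Counter

# Hypothetical anchor texts collected from a backlink audit.
anchors = [
    "Acme Widgets", "acmewidgets.com", "click here", "read more",
    "buy cheap widgets", "buy cheap widgets", "buy cheap widgets",
    "buy cheap widgets", "buy cheap widgets", "best widgets online",
]

# Generic anchors that make a profile look natural.
GENERIC = {"click here", "read more", "more info", "here", "visit us here"}

def anchor_profile(anchors, brand="acme"):
    """Summarize how keyword-heavy an anchor-text profile is."""
    counts = Counter(a.lower() for a in anchors)
    total = sum(counts.values())
    # Anchors that are neither generic, branded, nor naked URLs
    # are treated here as exact/partial keyword anchors.
    exact = sum(n for a, n in counts.items()
                if a not in GENERIC and brand not in a and "." not in a)
    return {
        "exact_or_keyword": round(exact / total, 2),
        "top_anchor_share": round(max(counts.values()) / total, 2),
    }

# A single keyword anchor dominating the profile is the classic
# over-optimization signal Penguin looks for.
print(anchor_profile(anchors))
```

In this sample, 60% of anchors are keyword anchors and half the profile is a single phrase, the kind of skew a natural profile rarely shows.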
Issues such as an unnatural link warning or bad anchor text leading to sudden ranking drops are generally attributed to the Penguin penalty. If your rankings (and traffic) took a noticeable dive on or around the release date of the update, there’s a good chance your site was affected by it.
The EMD Update
The EMD (Exact Match Domain) update was released in September 2012, and was designed to target sites that named their domains the exact keyword phrase they wanted to rank for. The biggest issue for Google was that most of these sites were spammy. These sites often had extremely thin content with little to no value on them.
These are the signals that the EMD update factors into its algorithm:
- Over-optimization of onpage content where the target keyword is contained in the domain name.
- Low quality on-page content, other than spammy keywords where the actual phrase is either an exact match or partial match to the domain.
- Over-optimization of anchor text with an excessive use of the main keyword.
- Excessive, low quality backlink profile.
- Lack of supporting social signals.
EMD incorporates some of the Panda and Penguin penalties, so if your site has been hit by the penalty, you should focus on cleaning up the areas that both penalties target. The process of dealing with the recovery of an EMD penalty is the same or similar to that of a Penguin penalty.
The Google Mobile Friendly Update
On April 21, 2015, Google released the mobile-friendly ranking algorithm. The update was designed to boost mobile-friendly pages in Google’s mobile search results. This update primarily boosts the rankings of the most mobile-friendly sites, so if your site is not mobile-friendly, rather than being penalized, it will be pushed down in the search results.
One of the best ways to prepare is to test that Google considers your web pages to be mobile-friendly by using its Mobile-Friendly Test tool.
Beyond testing, here are a few general mobile-friendly principles to keep in mind:
- Avoid software that most mobile devices can’t render, e.g., Flash.
- Use responsive design
- Use a text size that is easily readable on a small screen (typically 16px or more)
Top Heavy – Balance is the key to any impression
Google’s Top Heavy Update looks at your page layout; if it finds that the ads above the fold are excessive, it can penalize your site and downgrade it in the search results.
According to Google’s Webmaster Central Blog when the first update came out in 2012, Google stated that they had received “complaints from users that if they click on a result and it’s difficult to find the actual content, they aren’t happy with the experience. Rather than scrolling down the page past a slew of ads, users want to see content right away. So sites that don’t have much content “above-the-fold” can be affected by this change. Such sites may not rank as highly going forward.”
This is a site-based penalty. That means that either all of your content is penalized or none of it is. Google has also confirmed that they will not penalize all sites with above-the-fold ads, but just those sites that occupy too much real estate vs. useful content in the top section of a webpage.
Google released a special tool at browsersize.googlelabs.com to help you visualize if your site may or was impacted by this.
The Payday Loan Update
Google released the Payday Loan update to identify and penalize websites that use black hat techniques to improve their rankings for heavily trafficked search queries like “payday loans,” “Viagra,” “casinos” and various pornographic terms.
The update targeted spammy queries mostly associated with shady industries like super high interest loans and payday loans, porn, and other heavily spammed queries. The first payday loan update occurred in June of 2013. Payday loan update 2.0 occurred on May 16, 2014, with Payday 3.0 following shortly thereafter in June 2014.
The Pirate Algorithm Update
The “Pirate” algorithm was released in 2012, and was specifically designed to algorithmically penalize the growing number of torrent sites that were mainly used for pirating media and software. Google took a strong stance on piracy, which is essentially stealing copyrighted content.
The algorithm works based on copyright reports. If a site has a lot of copyright violations, it will be penalized by this algorithm. While new torrent sites can be established, they will be removed from the search results each time the algorithm is run if they have accumulated enough violations.
On-page guideline violations & related notifications
This set of violations and notifications applies to issues that have been found on a site that is directly under the site owner’s control.
Major Spam Problems
If you have received the manual action notification highlighting “major spam problems”, it means Google has identified pages on the site that are entirely spammy, with no value to users. In a majority of cases, this type of manual action results in a complete removal of the website from the Google index. The major spam penalty is most often applied to sites with scraped content and/or gibberish sites.
Spam Problems
When Google issues a notification highlighting “spam problems” (rather than “major spam problems”), it means the website isn’t completely bad. It refers to a series of pages on the site that are considered thin, duplicate or low quality content. The penalty also looks at how useful and engaging the landing pages’ contents are to users. This is not a site-wide penalty, and only the offending pages of the site will be penalized.
This penalty does not result in a complete removal from the Google index, but it will be much less visible in Google search results until the offending pages are removed. Once the offending content is removed, a reconsideration request must be submitted to Google.
User-Generated Spam
This notification mainly concerns sites that let other people create pages or add content. User-generated spam tends to affect large, user-driven sites that have been exploited by spammers. Google issues the message as a warning to the site owners to stamp out the offending content. So, if you have a blog and your comments are unmoderated or set to auto-approve, you are putting your site at risk of a Google manual penalty. In this case, Google considers the site useful but neglected.
Hacked Content Spam
The hacked content spam penalty is similar to user-generated spam in that the site has been targeted by spammers due to lax security. Google’s notification will include a sample URL, which gives the site owner an idea of where to start the investigation and what type of content to check while cleansing the site of spam.
The site gets a prominent label in the search results that warns users of the possible threat if they open the website, which leads to loss of potential traffic from Google search. Submitting a compelling reconsideration request is the first step toward resolving the problem and removing the “hacked” SERP label.
Spammy Structured Markup Penalty
The prospect of getting a rich snippet is really enticing, and attempts to game the system through the use of deceptive or inflated structured data is very much on Google’s radar. If you violate Google’s structured data markup guidelines, you’ll get a notification in Google Search Console highlighting spammy structured data, and your rich snippets will no longer appear in search results.
In March 2015, Google updated its rating and reviews Rich Snippet policies, stating that these types of snippets must be placed only on specific items, not on “category” or “list of items” landing pages.
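To make the distinction concrete, here is a minimal, hypothetical example of rating markup done the compliant way: the JSON-LD describes one specific product rather than a category or list page. It is built with Python’s json module purely for illustration, and the product name and figures are invented:

```python
import json

# Illustrative JSON-LD rating markup for a single, specific product
# page. Placing this on a "category" or "list of items" page is the
# kind of misuse the 2015 policy update targeted.
product_page_markup = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Acme Widget",  # hypothetical product
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.4",
        "reviewCount": "89",
    },
}

# Emit the <script type="application/ld+json"> payload.
print(json.dumps(product_page_markup, indent=2))
```

The key point is the one-to-one pairing: the rating describes the item the page is about, so the resulting rich snippet is neither deceptive nor inflated.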
Here is an example of a manual Structured Data penalty message sent by Google in the Search Console.
The penalty message reads as follows:
Spammy structured markup
Markup on some pages on this site appears to use techniques such as marking up content that is invisible to users, marking up irrelevant or misleading content, and/or other manipulative behavior that violates Google’s Rich Snippet Quality guidelines.
A penalty can be algorithmic or manual. A manual penalty can be partial or site-wide. Google has stated:
In cases where we see structured data that does not comply with these standards, we reserve the right to take manual action (e.g., disable rich snippets for a site) in order to maintain a high-quality search experience for our users.
Recovering from this penalty requires the submission of a reconsideration request; however, your rich snippets may not reappear even after Google removes the penalty.
Spam (manual) – If you’re going to play around, at least do it carefully
While most SEOs believe that spam refers solely to blasting thousands of links to a site, it’s much more than that.
The term spam, at least when it comes to manual penalties, also includes things such as:
- excessive or malicious cloaking
- scraping content
- automatically generated content
- and more.
Just like in the case of unnatural links manual actions, there are many different spam-related messages that can show up as a result of a manual action. These are the most common:
- “Pure spam.” The majority of the site is clearly spam, or the links pointing to it are all spam. It’s next to impossible to recover from this manual action.
- “User-generated spam.” If you have a site that allows users to submit content, you could be penalized for it if they abuse it to create spam content or links. Most commonly, this penalty refers to spam in comments or forum posts/profiles. It can be fixed.
- “Spammy freehosts.” If you’re unlucky enough to have your site hosted by the same web host that provides service to a ton of spammers, your site might be lumped together with them. This is a good reason to stay away from very cheap or free hosting services.
Since these are manual penalties, they can be fixed. Recovery usually involves either cleaning up on-site spam or disavowing spammy links.
Unnatural Outbound Links
This is a penalty issued by the Google manual actions team, and it is aimed at sites that contain patterns of “unnatural, artificial, deceptive or manipulative outbound links”. This penalty targets blogs that are specifically set up to sell links. Google penalizes the site by devaluing its outbound links, which means none of the sites it links to will get any SEO benefit.
However, because the penalty does not impact the penalized site’s own traffic, rankings or visibility, the only loser in this case is the person buying links from it: a buyer could pay for a link that has no value despite the site’s MozRank or any other external ranking metric. Toolbar PageRank was once the only way to detect whether a site had been penalized; today, there’s no way to know unless the owner discloses it, so spammy webmasters can continue selling links from their penalized sites.
Even once you see a recovery, you may still have algorithmic issues from the remaining unnatural links. If you imagine that a new site starts at 0 points, your revoked site will be starting at -X points, which is a hard place to begin your climb back to success.
Thin Content With No Added Value (manual)
If Google doesn’t get you with Panda, it may hit you with a manual review for having thin content. Thin or duplicate content typically consists of material that can be found elsewhere, either on or off your site.
If a manual reviewer spots that most of your content is derived from other content, you can get hit with this penalty, and your traffic will take a tumble.
Here are the most common scenarios that represent “little or no added value”:
- Automatically generated content
- Thin affiliate pages
- Content from other sources, e.g., scraped content or low-quality guest blog posts
- Doorway pages
When you go to the Manual Actions section in Search Console, you can see whether you’ve been hit by this penalty:
Pay close attention to whether it says that it’s a site-wide match or a partial match. If it’s a site-wide match, that means the penalty applies to all your content until you fix it. If you just have a few pages of thin content, it’s possible that the penalty will only affect those. While you should still fix it, it won’t have a huge effect on your traffic.
Google Fred
Don’t let the name fool you. Google Fred was a powerful update, and affected sites lost as much as 90% of their traffic. The Fred update is unique in the sense that it is not a single update with a very specific focus or a single set of target issues. Rather, it refers to any significant quality update released by Google that affects multiple sites but isn’t given a name.
The Google Fred update was initially released in March 2017, and not much has been disclosed by Google about what it targets. However, industry experts’ analysis of sites that got hit revealed that it is designed to hit low quality content sites that were primarily set up to generate ad revenue.
The user experience has always been a top priority for Google, and the algorithm affected sites with excessive ad placement and thin content. According to some research, sites with low quality backlinks were also targeted by Fred.
The Medic Core Algorithm Update
The Medic update was released in August 2018, and it was a very significant update that affected lots of websites across all verticals. Despite what the name suggests, it wasn’t solely aimed at medical websites, although medical, health and wellness, and medical eCommerce websites seem to have been hit the hardest.
Shortly before the update, Google made highly significant additions to its quality rater guidelines. These guidelines required publishers to show themselves to be an expert in their chosen field, an authority in their vertical, and worthy of being trusted by reviewers, peers and potential customers.
Google also defined websites that “could potentially impact the future happiness, health, financial stability, or safety of users” as “Your Money or Your Life” sites. These include:
- Websites that provide medical or health-related advice or information.
- Websites that offer legal advice or information on topics such as family matters and immigration.
- Financial websites that offer advice or information about finance related matters including tax, pensions, mortgages, investments, etc.
- News websites that inform the public of important news.
- Websites that process payments and allow people to transfer money or buy things online.
- Websites that provide information that can affect a user’s wellbeing, such as car safety and gas safety information and advice.
Sites with the following issues were heavily affected by the Medic update:
- Sites with old or outdated content
- Sites with content that was written by unqualified authors.
- Brands with a bad reputation.
- Sites where Google was unable to establish the website owner or find contact information
- Sites that made it difficult to find product return or refund details.
Google released the Medic algorithm around these updates. While the medical niche was heavily affected by the update, other verticals categorized as YMYL that were affected include law, shopping and finance. The update also affected rankings within the Maps pack.
The Macabee Update
The Macabee Update was released in December 2017, and led to a loss of 20-30 percent in organic traffic for affected sites. The update targeted web pages that were manipulating long-tail keywords by creating low-quality web pages using almost identical keyword permutations.
For example, if a nutritionist wanted to rank for “antiaging superfoods”, instead of writing one in-depth article on the topic, they would write five articles on the subject. Each article would target a variation of the same long-tail keyword using different permutations such as:
- “Top Antiaging Superfoods”
- “Best Superfoods to Keep You Young”
- “Superfoods That Boost Antiaging”
- “Antiaging Superfoods You Cannot Afford to Go Without”
- “The Antiaging Diet”
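One way to spot this pattern on your own site is to compare page titles for near-duplicates. The sketch below runs Python’s difflib over the example titles above; the 0.5 similarity threshold is an arbitrary choice for illustration, not anything Google has published:

```python
from difflib import SequenceMatcher
from itertools import combinations

# The example permutations from the text.
titles = [
    "Top Antiaging Superfoods",
    "Best Superfoods to Keep You Young",
    "Superfoods That Boost Antiaging",
    "Antiaging Superfoods You Cannot Afford to Go Without",
    "The Antiaging Diet",
]

def similar_pairs(titles, threshold=0.5):
    """Return title pairs whose character-level similarity exceeds threshold."""
    pairs = []
    for a, b in combinations(titles, 2):
        ratio = SequenceMatcher(None, a.lower(), b.lower()).ratio()
        if ratio > threshold:
            pairs.append((a, b, round(ratio, 2)))
    return pairs

# Flagged pairs are candidates for merging into one in-depth article.
for a, b, r in similar_pairs(titles):
    print(f"{r}: {a!r} ~ {b!r}")
```

Titles sharing a long common phrase (here, “Antiaging Superfoods”) get flagged, which mirrors the keyword-permutation pattern the Maccabees update targeted.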
The Panguin Tool
The Panguin Tool by Barracuda Digital is a free-to-use, information-gathering tool that you can use to quickly and easily find out if you’ve been hit by a Google penalty. The name “Panguin” is a combination of Panda and Penguin. To get as much information as possible, you need to have had Google Analytics installed on your site for a considerable length of time; if you’ve only just installed it, the tool is not going to give you much information.
The tool works by overlaying your analytics data with Panda and Penguin dates and allows you to see how every Google algorithm update since January 2012 has impacted your organic traffic.
When you hover over an update, a short summary of its key points is displayed, so you can get a better idea of how it potentially impacted your site.