Generate More Leads With Long Tail Keywords

Long Tail SEO

Long tail SEO involves optimizing your content around long tail keywords. Long tail keywords are longer, more specific keywords or phrases of three or more words that your prospects are more likely to use when they’re getting closer to the point of purchase or when they’re using voice search.

Searchers who use long tail keywords know exactly what they’re looking for and tend to be at an advanced stage of the buying funnel. For example, a user who enters a specific search query such as “black Ralph Lauren polo shirt with big pony” is far more likely to have commercial or buyer intent than a user who simply searches for “polo shirts”.

Shorter keywords are also a lot harder to rank for, which is why optimizing for a long tail keyword gives you a better chance of appearing at the top of the organic results than optimizing for the ultra-competitive “polo shirts”. There are hundreds of polo shirt brands, and if you’re running an AdWords campaign, you may not want to bid on a query as broad as “polo shirt” because you may not carry the particular polo shirt the user is searching for. That leads to wasted clicks, and enough of them can become very expensive.

In fact, you may want to add such keywords as exact match negatives. This is because you want to make sure that the traffic you’re paying for has the highest chance of converting. The more information in a search query, and the longer tail it is, the more we can learn about the commercial intent of the keyword.
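
If you manage search terms in bulk, a short script can do the first pass for you. Below is a minimal sketch, assuming you have exported a search terms report to CSV with “query” and “conversions” columns (both assumed names); it flags short, non-converting queries as exact match negative candidates using the bracket syntax AdWords uses for exact match keywords.

```python
import csv

# Minimal sketch: flag short, non-converting queries from an exported
# search terms report as exact match negative candidates.
# The file name and column names ("query", "conversions") are assumptions;
# adjust them to match your own export.

def negative_candidates(path, min_words=3):
    candidates = []
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            query = row["query"].strip().lower()
            conversions = float(row.get("conversions", 0) or 0)
            # Broad, short queries that never convert are the usual suspects.
            if len(query.split()) < min_words and conversions == 0:
                candidates.append(f"[{query}]")  # exact match negative syntax
    return candidates

if __name__ == "__main__":
    for negative in negative_candidates("search_terms_report.csv"):
        print(negative)
```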

The fact of the matter is that long tail keywords almost always deliver a better click-through rate, a lower cost per conversion and better on-site engagement than broader, short tail keywords.

Check out this video from Moz’s Rand Fishkin on long tail keywords:

Furthermore, once you’re ranking for the long tail keyword, you will naturally begin to rank for related and much shorter keywords. So, even though you might initially be attracting less traffic by optimizing for long tail keywords, the traffic you’ll be attracting to your site will be much more targeted, focused and more commercially driven, delivering much better return-on-investment (ROI) than shorter keywords.

According to Neil Patel, using long tail keywords as anchor text and in titles or subtitles in your blog gives you a ranking boost over simply including them in the body.

Here’s an infographic with information on the value of long tail keywords:

Broad and Long Tail Keywords

Broad search queries are typically used by searchers who are at the beginning of the buying cycle and are usually just starting their research. These terms are searched far more often than specific keywords, but conversion rates tend to be low because searchers are still at the awareness or research stage of the buying funnel. You would typically target broad keywords if you have branding goals in mind and are more interested in getting as many eyeballs on your site as possible than in converting traffic.

Examples of broad keywords and phrases:

“greeting cards”
“kids toys”
“used cars”

On the other hand, searchers who use long tail keywords in their search query already know exactly what they want and are in a more advanced phase of the buying funnel. These keywords are searched far less often and aren’t as popular as broad keywords, but conversion rates tend to be higher because searchers who use them are usually ready to buy. You would use specific keyword phrases if you are more interested in conversions than in getting as much traffic to your site as possible.

Examples of long tail keywords and phrases:

  • How much does a vintage Christmas greeting card cost?
  • Where can I buy a vintage Christmas greeting card online?
  • What does a vintage Christmas greeting card cost?
  • Where is the best greeting card store closest to me?

When creating long tail keywords, you’ll typically start with one or two-word keywords or phrases, also known as “core” or “seed” keywords. Core keywords typically have lots of search traffic and are far too competitive to optimize for.

For example, if you are writing content for a company that sells educational toys, your core keyword would be something like “toys”. Optimizing for such an ambiguous word would be a bad idea because it is very difficult to figure out the intent of the searcher; people who use such words may have all sorts of motives, and they are typically at the very beginning of their research.

On the other hand, a more descriptive and specific long tail keyword such as “educational toys for three year old girls uk” clarifies the intent of the searcher: they are looking for educational toys for three-year-old girls and are based in the UK. People who use these types of keywords have already completed their research and are more likely to convert.

It is not difficult to come up with more specific, long tail keywords. The harder question is how many people are actually searching for them. Even if a search term seems relevant, if you are simply speculating about what people type in to find your products or services, you are guessing and could be making a big mistake: the actual number of people searching in that context could be too low to make optimizing for the term worthwhile.
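
One low-cost way to move from guessing to a real candidate list is to expand your seed keyword through autocomplete suggestions and then check each candidate’s volume in a keyword research tool. The sketch below uses Google’s unofficial autocomplete endpoint, which is undocumented and may change or be rate limited, so treat it only as a starting point.

```python
import json
import urllib.parse
import urllib.request

# Rough sketch: expand a seed keyword into long tail candidates using
# Google's unofficial autocomplete endpoint. This endpoint is undocumented
# and may change or be rate limited; validate search volume for any
# candidate with a proper keyword research tool before targeting it.

def autocomplete(seed):
    url = ("https://suggestqueries.google.com/complete/search"
           "?client=firefox&q=" + urllib.parse.quote(seed))
    with urllib.request.urlopen(url, timeout=10) as response:
        payload = json.loads(response.read().decode("utf-8", errors="replace"))
    return payload[1]  # response shape: [query, [suggestions, ...]]

if __name__ == "__main__":
    seed = "educational toys"
    suggestions = set()
    # Appending letters surfaces a wider spread of long tail phrases.
    for letter in "abcdefghijklmnopqrstuvwxyz":
        suggestions.update(autocomplete(f"{seed} {letter}"))
    for phrase in sorted(s for s in suggestions if len(s.split()) >= 3):
        print(phrase)
```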

Keyword Targeting

Targeting long tail keywords simply means choosing a particular long tail keyword and then writing content to address the user intent behind it. In other words, you focus on understanding the intent of the user who types that search term and the problems or challenges they are experiencing. Targeting these keywords in your content will attract engaged visitors who are interested in what you’re selling.

Developing a content marketing strategy around long tail keywords that are used to search for your products and services by your target audience is key to increasing your organic search ranking for your target keywords, and ultimately attracting the right traffic that converts on your site’s goals. If you are still using the wrong keywords to optimize your site, you won’t reach or attract the right target audience.

Increased Conversion Rates

Searchers who use long tail keywords when looking for information tend to convert at a higher rate than other visitors. Generally, the longer the search query, the lower the search volume and, thus, the lower the competition. Individually, long tail keywords may not account for many searches, but taken together they attract far more relevant, likely-to-convert visitors to your website than general keywords such as “flowers”, “roses”, “mens shoes” and “leather jackets”.

The natural benefit of targeting specific keywords consisting of three words or more is that when you start to rank and get traffic for the longer keyword phrases such as “where is the best book store near me,” and “which book shop is open right now”, you will naturally begin to rank better for shorter versions of the keyword phrases, such as “book store,” “book shop,” and “buy books.”

Semantic Search

Google’s Hummingbird algorithm has made developing content that speaks directly to your target audience, and directly answers the specific questions they might be asking, an absolute priority. To rank your content in Google search, rather than simply finding the best keywords for your website or web pages, it is much more important to figure out the many different questions your potential customers may ask and provide subject-relevant, meaningful and detailed answers that specifically and directly address those questions.

It is now extremely important to anticipate the many different questions your customers and potential customers may have, and deliver insightful, meaningful solutions to those customer questions across social platforms including your blog, Google+ and other relevant communities. If you are highly focused on addressing the specific needs of your audience, your content is more likely to rank according to the Google Hummingbird algorithm.

Essentially, long tail keywords are probably some of the simplest types of keywords to optimize for, mainly because you will not see as much competition for these types of keywords as you will for shorter, more competitive search terms. In addition, because the searcher is using such specific terms rather than more generic keywords, it is very likely that they have completed their research and are now ready to buy.

Lower Advertising Costs

Long tail keywords are even more valuable for users who are running paid search advertising campaigns because the cost-per-click is lower due to less competition. In fact, you can get higher ad placements and much cheaper clicks.

The Definitive Guide to Google Penalties

If your site has experienced a sudden and significant drop in organic traffic for a number of key search terms you once ranked highly for, you may have been hit by a Google penalty. According to Matt Cutts, the former head of Google’s web spam team, Google applies over 400,000 manual penalties every month to websites that violate its webmaster guidelines and are deemed egregious enough to trigger sanctions by the Google Search Quality team.

There are two types of search engine penalties, as Matt Cutts has discussed: manual and algorithmic.

Manual penalties

If a manual penalty has been applied to your site, you’ll receive a notification from Google’s spam team within your Google Search Console account. If the team has found something they believe is manipulative or against the Google webmaster guidelines, it can impose a penalty.

In most cases, manual penalties are caused by off-site factors such as spammy backlinks. In extreme cases, your entire domain may be removed from Google’s index. The only way to respond to a manual penalty is via a reconsideration request, which Google must then approve before the penalty is removed.

Manual Review Penalty Symptoms

Manual review penalties are usually more severe and tougher to recover from than algorithmic penalties; in extreme cases, Google recommends retiring the domain name and starting all over again. When a manual penalty has been applied to your site, you need to respond by removing as many manipulative links to your site as possible.

This often involves auditing your entire inbound link profile, then contacting the webmasters of sites that link to yours and politely asking them to remove the links. If a webmaster ignores you, you may have to resort to the disavow links tool to neutralize the spammy links. Thereafter, you can send a well-written reconsideration request to Google.

Algorithmic penalties

An algorithmic penalty typically occurs after Google updates its algorithms. These penalties are harder to detect because you don’t get any notification from Google in Search Console. However, you can analyse your website’s traffic data in Google Analytics to see if your rankings (and traffic) took a noticeable dive on or around the time of a specific algorithmic update. If so, there’s a good chance your site was affected by the update.

The most popular algorithmic updates are Google Panda, which punishes on-page infractions such as poor content quality and over-optimization of specific on-page elements, and Google Penguin, which is mainly focused on your backlink profile and anchor text distribution.

The following are symptoms of an algorithmic penalty:

A specific group of links suddenly stops providing value. If you find that some of your web pages have suddenly dropped out of the search engine results pages altogether, those pages could have been affected by a recent algorithmic penalty.

This is what happens when an entire private blog network that has been feeding those webpages with link juice is identified as spam and deindexed. Consequently, your site no longer receives the link juice that those penalized links were providing.

The entire domain starts ranking lower for all or most of its target keywords or phrases. This is a strong indication that the website has breached Google’s webmaster guidelines. In this case, every keyword ranks 30 to 50 spots below where it did before the penalty; in effect, Google pushes your site down 30 or more positions whenever it comes up in the search results, dropping it three or more pages.

If you notice a drop in organic search traffic that corresponds with known algorithm updates, you can be fairly confident which update hit your site. To confirm, compare your analytics data to Moz’s Google Algorithm Change History.
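
If you prefer to check this programmatically, here is a rough sketch that compares average daily organic sessions in the two weeks before and after each known update date. The CSV layout and the update dates shown are placeholders; export your own organic traffic data and take the real dates from Moz’s change history.

```python
import csv
from datetime import date, timedelta

# Quick sketch: compare average daily organic sessions before and after
# known algorithm update dates. Assumes a Google Analytics export with
# "date" (YYYY-MM-DD) and "sessions" columns; the update dates below are
# illustrative placeholders -- check Moz's Google Algorithm Change History
# for the real ones.

UPDATES = {
    "Panda refresh (placeholder date)": date(2014, 9, 23),
    "Penguin 3.0 (placeholder date)": date(2014, 10, 17),
}

def load_sessions(path):
    with open(path, newline="", encoding="utf-8") as f:
        return {date.fromisoformat(row["date"]): float(row["sessions"])
                for row in csv.DictReader(f)}

def window_mean(traffic, start, days):
    values = [traffic[start + timedelta(d)] for d in range(days)
              if start + timedelta(d) in traffic]
    return sum(values) / len(values) if values else 0.0

if __name__ == "__main__":
    traffic = load_sessions("organic_sessions.csv")
    for name, day in UPDATES.items():
        before = window_mean(traffic, day - timedelta(days=14), 14)
        after = window_mean(traffic, day, 14)
        change = (after - before) / before * 100 if before else 0.0
        print(f"{name} ({day}): {change:+.1f}% vs the prior two weeks")
```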

Identifying Search Engine Penalties

The first step to recovering from a manual or algorithmic penalty is to find out exactly which penalty has been applied to your site; SEO recovery tools make this much easier to pin down.

Manual on-page penalties

This set of penalties is applied to sites with issues discovered on the site itself. Webmasters hit with these penalties get a message in Search Console.

Thin Content Penalty

In 2013, Google introduced the thin content manual penalty. This penalty is designed to de-index websites that create content with little or no value. This type of content includes long-form, keyword-rich articles that provide no real value, and are primarily designed to rank for specific keywords. If Google’s algorithms detect that a particular page’s content is keyword-stuffed, duplicated or has a very high bounce rate, it will be categorized as thin content.

It is important to note that once thin content is detected on a website, the entire site could be penalized and removed from the search results. To be re-included in search results, you’ll have to identify all of the content that can be categorized as thin content and replace this content with original content.

Here are examples of thin content as defined by Google:

  • Automatically generated content
  • Thin affiliate pages
  • Scraped content or low quality blog posts
  • Doorway pages

In this video, Google’s Matt Cutts clarifies what is meant by thin content:

If you receive a message that the penalty is sitewide, it means that Google considers the entire site in violation of its quality guidelines, and the whole site is penalized. A partial match penalty means only a portion of the site is considered in violation.

For more info on thin content, check out this article by Moz’s Dr Pete.

Major Spam Problems

If you receive a manual action notification highlighting “major spam problems”, it means Google has identified the site as entirely spammy, with no value whatsoever to users. In the majority of cases, this type of manual action results in the complete removal of the website from the Google index. The major spam penalty is most often applied to sites with scraped content and/or gibberish content.

Spam Problems

When Google issues a notification highlighting spam problems, it means the website isn’t completely bad. The notice refers to a series of pages on the site that contain thin, duplicate or low quality content; the penalty also looks at how useful and engaging the landing pages’ content is to users. This is not a site-wide penalty, and only the offending pages of the site will be penalized.

This penalty does not result in a complete removal from the Google index, but it will be much less visible in Google search results until the offending pages are removed. Once the offending content is removed, a reconsideration request must be submitted to Google.

User-Generated Spam

User-generated spam tends to affect large, user-driven sites that have been exploited by spammers; Google issues the message as a warning to the site owner to stamp out the offending content. In this case, Google considers the site useful but neglected. The message generally includes a sample URL where user-generated spam has been detected, and the entire site isn’t penalized.

Hacked Content Spam

If your site has been hacked due to poor security, Google will hit it with the hacked content spam penalty. In its message, Google will include a sample URL, which gives you an idea of where to start the investigation and what type of content to check while cleansing the site of spam.

The site will also get a prominent label from Google in the search results that warns users of the possible threat if they open the website. This will lead to loss of potential traffic from Google search. Submitting a compelling reconsideration request is the first step toward resolving the problem and removing the “hacked” SERP label.

Spammy Structured Markup Penalty

The prospect of getting a rich snippet is really enticing, and attempts to game the system through the use of deceptive or inflated structured data is very much on Google’s radar. If you violate Google’s structured data markup guidelines, you’ll get a notification in Google Search Console highlighting spammy structured data, and your rich snippets will no longer appear in search results.

In March 2015, Google updated its rating and reviews Rich Snippet policies, stating that these types of snippets must be placed only on specific items, not on “category” or “list of items” landing pages.

Here is an example of a manual Structured Data penalty message sent by Google in the Search Console.

The penalty message reads as follows:

Spammy structured markup

Markup on some pages on this site appears to use techniques such as marking up content that is invisible to users, marking up irrelevant or misleading content, and/or other manipulative behavior that violates Google’s Rich Snippet Quality guidelines.

A penalty can be algorithmic or manual. A manual penalty can be partial or site-wide. Google has stated:

In cases where we see structured data that does not comply with these standards, we reserve the right to take manual action (e.g., disable rich snippets for a site) in order to maintain a high-quality search experience for our users.

Recovering from this penalty requires submitting a reconsideration request; however, your rich snippets may not reappear even after Google removes the penalty.

Unnatural Outbound Links

The unnatural outbound links penalty was issued by the Google manual actions team in April 2016, and it was designed to penalize sites that contain patterns of “unnatural, artificial, deceptive or manipulative outbound links”. In general, this penalty is aimed at blogs that are specifically set up to sell links and that link out to all manner of sites which are not editorial references. Google penalizes the site by devaluing its outbound links, which means none of the sites it links to will get any SEO benefit.

Unnatural Inbound Links

This is the most frequently experienced Google manual penalty linked to the Penguin algorithm, and it is a pretty severe one. You’ll receive the dreaded “unnatural inbound links” notification in your Search Console. Affected sites are considered to be engaging in link schemes against Google’s webmaster guidelines such as being a member of manipulative link networks. The penalty’s impact can be partial or affect the entire domain. If you get hit, you could lose all of your organic traffic from Google literally overnight.

It is vital to conduct a full, in-depth backlink audit and identify all of the toxic links. This means using several different backlink analysis tools so you can find as many of the links as possible.

Recovering from this penalty is not easy, although it does depend on how bad your backlink profile is. To begin with, your first reconsideration request will most likely fail even if you have done everything right; Google does this to make webmasters look even deeper and clean up anything that might be borderline.

Documenting everything you’ve done to fix the situation is essential. Google doesn’t take reconsideration requests very seriously if that detailed documentation isn’t included, so show Google that you’ve done everything you can to correct the problem with a detailed report of what you’ve done to get rid of the toxic links. List every dofollow link along with its contact information and the action taken.

Upload the disavow file first (and receive confirmation of the change) before submitting documentation outlining all the relevant steps taken to resolve the issues Google has highlighted. Include the outcome of each removal attempt in your disavow file (for example: no reply, removed, 5th try, pending, no contact info), and it’s also good to note the type of link (directory, blog post, comment, article and so on). All of this demonstrates to Google that you have genuinely made an attempt.
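
If your audit lives in a spreadsheet, a small script can turn it into a disavow file and keep those outcome notes as comments. The sketch below assumes a CSV with “domain”, “outcome” and “link_type” columns (assumed names); the output follows Google’s documented disavow format, where “#” starts a comment and each line is either a full URL or a “domain:” entry.

```python
import csv

# Sketch of generating a disavow file from a link audit spreadsheet.
# The audit CSV and its columns ("domain", "outcome", "link_type") are
# assumptions -- use whatever your own audit produced. The file format
# itself follows Google's documented rules: "#" starts a comment, and each
# line is either a full URL or "domain:example.com".

def build_disavow(audit_path, out_path="disavow.txt"):
    with open(audit_path, newline="", encoding="utf-8") as f, \
         open(out_path, "w", encoding="utf-8") as out:
        for row in csv.DictReader(f):
            if row["outcome"].lower() == "removed":
                continue  # already cleaned up, no need to disavow
            # Record the removal attempt and link type as comments so the
            # reviewer can see the work that was done.
            out.write(f"# outcome: {row['outcome']}, type: {row['link_type']}\n")
            out.write(f"domain:{row['domain']}\n")

if __name__ == "__main__":
    build_disavow("link_audit.csv")
```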

Note that the domain may never fully recover unless you replace the bad links with new links of comparable ranking power. Once most of the links have been re-indexed, the algorithm must then be updated with the new information.

Now that Penguin is part of the more than 200 signals used in Google’s core algorithm, Penguin updates roll out more frequently, so both penalties and recoveries happen faster.

Algorithmic Penalties

Panda

The Google Panda algorithm was released on February 24, 2011, and is built entirely around content. It is essentially a quality filter that analyses an entire website’s content, and it was specifically designed to lower the rank of sites with low quality or thin content. It targets sites with duplicate, plagiarized or thin content; user-generated spam; and keyword stuffing.

As Matt Cutts, then head of Google’s web spam team, put it in a blog post announcing the first iteration of Panda in 2011:

“This update is designed to reduce rankings for low-quality sites—sites which are low-value add for users, copy content from other websites or sites that are just not very useful. At the same time, it will provide better rankings for high-quality sites—sites with original content and information such as research, in-depth reports, thoughtful analysis and so on.”

In a nutshell, Google wants webmasters to focus on delivering the best user experience possible on their websites so they can send users to the most relevant pages with the highest quality available on the web. If your site has a certain amount of what Google describes as poor quality content that provides little or no value, the entire site will be categorized as a low quality site, and it will be filtered from ranking high in the search results. Panda rollouts have become more frequent, so both penalties and recoveries now happen faster.

User Engagement

Panda also made “user engagement” a ranking factor. Here are some of the factors considered by the algorithm:

  • how does the visitor engage with the website’s content when they receive it?
  • does the visitor share it?
  • does she comment on it?
  • does she stay on the site a long time?
  • does she access more than one page on that site?
  • does she leave within 30 seconds of arriving on the site?
  • does she return to the site or mention it independently afterward?

Today, even pages that are a perfect keyword match may now be filtered from the search results due to weak user engagement. So, ranking at the top of the first page of Google is not just about creating high quality content and getting more social signals and relevant backlinks pointing to your site. It’s also about how visitors to your site engage with it.

Panda is a site-wide penalty, not a page penalty. This means that if a site has a certain amount of poor quality content, the entire site falls foul of Panda’s quality algorithm and the whole site is filtered out of the top ten search results.

Note that unless you have a considerable amount of low quality content, you won’t get hit with a manual penalty from Google; your pages simply won’t rank high no matter what you do. If you’re experiencing ranking issues, don’t automatically assume that you need more links. Instead, consider performing a comprehensive content audit to identify whether your site has a lot of what Google defines as thin, low quality content that provides little value.
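
As a rough first pass at such an audit, the sketch below fetches a list of URLs and flags pages under an arbitrary word-count threshold as thin content candidates. Word count alone is not a measure of quality, so treat the output as a shortlist for manual review; the file names and threshold are placeholders.

```python
import csv

import requests
from bs4 import BeautifulSoup

# First-pass content audit sketch: fetch a list of URLs and flag pages whose
# main text falls under a word threshold as thin content *candidates*.
# Word count alone does not measure quality -- treat the output as a
# shortlist for manual review, not a verdict. "urls.txt" is an assumption;
# feed it your own URL list (e.g. exported from your sitemap).

THRESHOLD = 300  # words; adjust for your niche

def word_count(url):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup(["script", "style", "nav", "footer"]):
        tag.decompose()  # strip boilerplate before counting
    return len(soup.get_text(separator=" ").split())

if __name__ == "__main__":
    with open("urls.txt") as f, open("thin_candidates.csv", "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(["url", "words"])
        for url in (line.strip() for line in f if line.strip()):
            words = word_count(url)
            if words < THRESHOLD:
                writer.writerow([url, words])
```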

Penguin

Google Penguin looks for spammy and irrelevant links. The algorithm analyzes the inbound link profile of every website for over-optimized anchor text. If a backlink profile contains few or no branded anchors, naked URLs or universal anchors (i.e., “click here”, “more info”, “read more” or “here”), then the profile looks heavily optimized and the site is likely to be susceptible to a Penguin penalty. Non-descriptive text links such as “read more”, “click here”, “check out this website” and “visit us here” are a great way to keep your profile looking natural and diverse.

In addition, there are three main backlink factors that can be used to identify these types of link patterns:

  • Link quality – Sites with a natural link profile will include both high and low quality links. Manufactured link profiles tend to have only low quality links or only high authority links (such as those from a private blog network).
  • Link growth – Sites with manufactured link profiles tend to build lots of links within a very short period. Sites that build links naturally tend to build them steadily over time. Avoid unusual spikes in link growth.
  • Link diversity – Legitimate sites attract links from diverse sources (contextual links, blog comments, news sites, resource sites, etc.). Links from only a few types of sources (such as blog comments and directories) look manipulative.

Rather than affecting the ranking of an entire website, Penguin now devalues spam by affecting the ranking of individual pages based on spam signals.
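
To check your own profile for the over-optimized anchor patterns described above, you can bucket the anchors from a backlink export and look at the distribution. The sketch below assumes a CSV export with an “anchor” column and uses placeholder brand and keyword lists; swap in your own.

```python
import csv
from collections import Counter

# Sketch of an anchor text distribution check on a backlink export.
# The CSV and its "anchor" column are assumptions (most backlink tools can
# export something similar), and the brand/keyword lists are placeholders.
# A profile dominated by money keywords, with few branded, naked URL or
# generic anchors, is the over-optimized pattern described above.

BRAND_TERMS = ["acme widgets", "acme"]          # hypothetical brand names
MONEY_TERMS = ["buy widgets", "cheap widgets"]  # hypothetical target keywords
GENERIC = {"click here", "read more", "here", "more info", "visit us here"}

def classify(anchor):
    a = anchor.strip().lower()
    if a.startswith("http") or a.startswith("www."):
        return "naked URL"
    if a in GENERIC:
        return "generic"
    if any(term in a for term in BRAND_TERMS):
        return "branded"
    if any(term in a for term in MONEY_TERMS):
        return "exact/partial match"
    return "other"

if __name__ == "__main__":
    with open("backlinks.csv", newline="", encoding="utf-8") as f:
        counts = Counter(classify(row["anchor"]) for row in csv.DictReader(f))
    total = sum(counts.values()) or 1
    for bucket, n in counts.most_common():
        print(f"{bucket:20s} {n:5d}  ({n / total:.0%})")
```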

Mobilegeddon

On April 21, 2015, Google released the mobile-friendly ranking algorithm. The update was designed to boost mobile-friendly pages in Google’s mobile search results. This update primarily boosts the rankings of the most mobile-friendly sites, so if your site is not mobile-friendly, rather than being penalized, it will be pushed down in the search results.

One of the best ways to prepare is to check whether Google considers your web pages mobile-friendly by using its Mobile-Friendly Test tool.
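
If you have a lot of URLs to check, Google also exposes the Mobile-Friendly Test as an API through its Search Console URL testing tools. The sketch below is based on that API, but the endpoint and response fields may change, so verify them against the current documentation; the API key and test URL are placeholders.

```python
import requests

# Hedged sketch: calling Google's Mobile-Friendly Test API rather than
# using the web tool by hand. The endpoint, request body and response
# fields are based on the Search Console URL Testing Tools API and may
# change -- check the current documentation and supply your own API key.

API_KEY = "YOUR_API_KEY"  # placeholder
ENDPOINT = ("https://searchconsole.googleapis.com/v1/"
            "urlTestingTools/mobileFriendlyTest:run")

def mobile_friendly(url):
    response = requests.post(ENDPOINT, params={"key": API_KEY},
                             json={"url": url}, timeout=30)
    response.raise_for_status()
    result = response.json()
    verdict = result.get("mobileFriendliness", "UNKNOWN")
    issues = [issue.get("rule") for issue in result.get("mobileFriendlyIssues", [])]
    return verdict, issues

if __name__ == "__main__":
    verdict, issues = mobile_friendly("https://www.example.com/")
    print(verdict)
    for rule in issues:
        print(" -", rule)
```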

Google also released a useful mobile SEO guide. It explains the most common mobile errors, such as blocking JavaScript or messing up your mobile redirects.

On top of those mistakes, here are a few more general mobile-friendly principles to keep in mind:

  • Avoid software that most mobile devices can’t render, e.g., Flash.
  • Use responsive design.
  • Use a text size that is easily readable on a small screen (typically 16px or more).

Top Heavy

Google’s Top Heavy update looks at your page layout; if it finds that the ads above the fold are excessive, it can penalize your site and downgrade it in the search results.

According to Google’s Webmaster Central Blog when the first update came out in 2012, Google stated that they had received “complaints from users that if they click on a result and it’s difficult to find the actual content, they aren’t happy with the experience. Rather than scrolling down the page past a slew of ads, users want to see content right away. So sites that don’t have much content ‘above-the-fold’ can be affected by this change. Such sites may not rank as highly going forward.”

This is a site-based penalty, which means that either all of your content is penalized or none of it is. Google has also confirmed that it will not penalize all sites with above-the-fold ads, just those whose ads take up too much real estate relative to useful content in the top section of the page.

Google released a special tool at browsersize.googlelabs.com to help you visualize whether your site might be, or already was, impacted by this change.

Payday Loan

Google released the Payday Loan update to identify and penalize websites that use black hat techniques to improve their rankings for heavily trafficked keyword queries like “payday loans,” “Viagra,” “casinos” and various pornographic terms. The update targeted spammy queries mostly associated with shady industries such as super-high-interest loans, porn and other heavily spammed niches. The first Payday Loan update occurred in June 2013; Payday Loan 2.0 rolled out on May 16, 2014, with Payday 3.0 following shortly thereafter in June 2014.

Pirate

The “Pirate” algorithm was released in 2012, and was specifically designed to algorithmically penalize the growing number of torrent sites that were mainly used for pirating media and software. Google took a strong stance on piracy, which is essentially stealing copyrighted content.

The algorithm works based on copyright reports. If a site has a lot of copyright violations, it will be penalized by this algorithm. While new torrent sites can be established, they will be removed from the search results each time the algorithm is run if they have accumulated enough violations.

Fred

Google Fred was a fairly significant algorithm update released in March 2017. The majority of affected sites were blogs with thin, low quality content on all sorts of topics. These sites carried a large number of ads or affiliate links spread throughout the site and seemed to have been created for the express purpose of generating revenue rather than solving problems for users.

The Fundamentals of Organic Link Building

Importance of Link Building

Organic link building is a major factor in Google’s ranking algorithm and the most challenging element of search engine optimization. It is the most powerful and effective way to increase your website’s visibility on the search engine results page and drive organic traffic to your site. Links remain the most important signal for a high organic search ranking.

Links (also known as backlinks, inbound links or hyperlinks), are the lifeblood of any website. They are the way search engines discover websites.

Two critical points regarding links and link building:

  1. Link building is fundamental to SEO. Without link building, your site will fail.
  2. Link building should never stop. It is an ongoing part of marketing your website on the web.

Relevant backlinks from trustworthy and reputable websites lend credibility to your site and are used in the search engines’ algorithms to determine whether you can be trusted as a relevant source of information and an “expert” in your field. In short, links can make or break a site’s organic search ranking.

In a search engine ranking factors experts survey conducted by Moz, many leading SEO experts concurred that link building is the single most important objective for attaining high search engine rankings.

The right links to your site can boost your organic search ranking and significantly increase your visibility in the search engines. On the flip side, without contextually relevant links from well-established and authoritative websites, your site will never rank high in the search engines for your target keywords. Social and user experience signals may play a larger role in the future, but for now, Google prioritizes links when evaluating the quality of a webpage.

Essentially, Google relies on a site’s backlink profile to evaluate its relevance, usefulness and authority. Your on-page content tells the search engines what your site is about. However, it is the quality and, depending on your competitors, quantity of the links pointing to your site that Google will use to determine whether your site should be categorized as a high or low quality site. This is what will ultimately determine how high you will rank in organic search.

In April 2016, Google’s manual actions team issued outbound link penalties against blogs and other sites suspected of selling links. There are millions of private blog networks set up specifically to monetize guest posting, and Google has been carefully watching and identifying these types of blogs for a while. It is now taking action. There is always a big risk in buying links from low quality blogs that have been set up specifically to manufacture links.

Link Building Mistakes

You should not focus too heavily on link building alone as the main way to increase your popularity on the web and gain higher rankings. SEO in 2016 is a socially focused discipline, and your links need to look 100% natural and organic. With the incorporation of artificial intelligence (AI) into Google’s ranking algorithm, most loopholes that allowed black hat tactics to thrive in the search engine have been firmly shut. As a semantic search engine, Google’s algorithms are now more sophisticated than ever, and they can detect manipulative patterns in a backlink profile very easily.

How Natural Are Your Links?

There are a variety of factors Google looks at to determine how natural your links are. If you are steadily building links to your site, it means you are popular, so Google expects to see mentions of and searches for your brand. Google also needs to see social signals to your website that match the pace at which you are acquiring links. If you’re consistently getting links but aren’t getting any social signals, and hardly anyone is talking about you or searching for your brand name, this may trigger algorithmic red flags.

Search Volume

If there is not enough search volume for the keywords you’re ranking for, you should not be too aggressive with your link building strategy. This is very important. For example, if you’re ranking for a keyword that gets around 50 searches per month but are building 100 dofollow backlinks per month, it’s going to look incredibly unnatural. If your niche doesn’t get many keyword searches per month, you’ll need to factor this into your link building strategy.

Remember: Google wants the majority of links to your site to be editorial links, i.e. links from people who find your content useful and relevant enough to link to. If there is no search traffic to back up your link popularity, an aggressive link building campaign may trigger algorithmic filters, which could ultimately lead to a manual review of your website.

Sending the Wrong Signals

The primary goal of your SEO link building strategy should be to earn as many organic, high quality editorial links and social signals as possible, and these should appear as natural as possible. If Google sees a lot of backlinks created in a short period of time, that sends a signal. High velocity link building needs to be matched by a strong PR, branding or social media campaign; if it isn’t, Google may suspect that you’re manufacturing your importance and popularity, because it isn’t natural. This can lead to problems that affect your website’s ability to attain a high search ranking, which is why acquiring social signals is a must, even before you begin your link building campaign.

Even though social signals may not have any direct ranking effect, as Matt Cutts confirmed in 2014, the search engines still use data from social networking sites to determine how real or authentic your brand is. In any case, it will look very odd if your site is generating backlinks but isn’t generating any social signals. You need to show as many natural signals as possible.

This is why a social media strategy is as important as your SEO and link building strategy. The search engines look at how much authority your website has on the major social networking sites, how many people are sharing your content, how engaged your followers are with your content, how many +1’s, shares and retweets your content is attracting, as well as the authority of the people who retweet your content.

Link Velocity

With link building, there’s a general misconception that the more dofollow links you build the better, especially if the competition has a massive number of them. If you have received some good press lately or are running major branding campaigns, and you are in a competitive industry with large keyword search volumes, then lots of dofollow links won’t look unnatural. As long as the links were not generated by automated link building software, there’s no need to worry about building links too fast. Furthermore, if the majority of links you’re building are nofollow links, this will be fine, as it will not look as though you’re trying to manipulate Google’s PageRank algorithm.

However, if you run a relatively new site, building dofollow links too fast can look unnatural in some circumstances. For example, if you’re not running a strong PR and branding campaign, how do you explain the upsurge in links? Your link building campaign needs to be backed by a strong social media marketing campaign that is driving traffic at a steady pace. As long as there are plenty of social signals hitting your website, you can get away with building more links than is recommended below.

Link Variety

It is also critically important to build a diverse variety of links from all types of sources with nofollow and dofollow links mixed in. As a new site, most of the initial links you build are probably going to be nofollow. This is okay because it looks very natural and shows you’re not trying to spam your way to a high search ranking.

Consider a local business in a niche where there isn’t much keyword search volume. Say you’re looking to rank 2 pages per week. You should be building no more than 1 or 2 links a week per page, per keyword. Note that the majority of your dofollow links should be going to your blog rather than to your home page.

In any case, you’ll want to link slowly starting out. Here’s the recommended link building velocity for the first 3 months for websites in more competitive niches:

Month 1: Branded links to your home page, with 85% of links going to your internal pages.

Month 2: All types of links with 75% going to your internal pages.

Month 3: All types of links with the majority going to your internal pages.

It is important to continually analyze the results you’re getting from your link building campaign. You need to evaluate the impact that each link is having on your organic search ranking and the overall SEO campaign. This will let you know which links are most effective for your particular situation.

Keeping Your Backlink Profile Clean

It is critically important to keep your backlink profile as free as possible of toxic links. If more than 10% of your link profile consists of toxic links, you are in the Google filter risk group. Under Penguin-proof link building guidelines, backlinks are considered toxic when they are obtained from sites in any of the following categories, or under any of these circumstances:

  • Sites with links to pills, casinos, gambling or porn sites.
  • Sites where the majority of links come from a private blog network on the same subnet/IP.
  • Sites with excessive numbers of links, or where the ratio of links to content is high. Examples include poorly moderated forums and resource pages on low quality websites that will link to any type of site.
  • Sites with excessive amounts of ads.
  • Hacked sites riddled with malware and other malicious software.
  • Poor quality web directories that are willing to list any and all kinds of sites for a fee.
  • Sites that have been heavily hit with comment spam.
  • Sites with a high percentage of anchored sitewide links, i.e. links placed in sidebars, footers or headers.
  • Sites whose backlink profiles consist of a high percentage of links with exact-match anchor text.

Toxic links will not count towards building your authority and rankings in the search engines, and the best way to avoid them is to stay well away from bad neighbourhoods. Google Penguin also made it easier to identify toxic links.

Toxic links include links that:

  • are sitewide links that appear in large numbers across a domain;
  • are from linking sites with little or no traffic;
  • come from webpages with a lot of outbound links.

Are You Building Backlinks from Worthless Blogs?

Today, guest blogging is the number one strategy that most people use to build backlinks to their sites. Guest blogging is so popular because, unlike most other natural link building strategies, it is quick, easy and virtually stress free. Finding bloggers that are willing to sell links is a very simple process, especially with sites like Fiverr.

However, with the unnatural outbound links penalty which Google released in April 2016, the guest blogging game has surely changed. If you currently use guest blogging as your main link building strategy, you need to be well aware of this penalty. Google actually released an unnatural outbound links penalty in 2013 which was also aimed at sites that sell links, but it wasn’t necessarily aimed at guest blogging.

Here’s Matt Cutts discussing the penalty:

The difference between that penalty and the one released in April 2016, however, is that the latter was aimed at blogs specifically set up to generate an income from selling links. What is really unique about this penalty is that it is practically impossible to tell whether a site has been penalized, because its organic visibility is not affected in any way.

In fact, a penalized site could still have a high Trust Flow, high Ahrefs rating and high Domain Authority. The site is nonetheless effectively worthless in terms of passing link juice because Google has reduced its PageRank to 0. This means that if you’ve paid good money for the link, you’ll get no benefit whatsoever, which is exactly what Google wants.

In this article you’re going to learn more about this penalty so that you’ll know what to look out for when building organic links to your site.

The Value of High Quality Links

Links are the number one ranking factor in Google’s organic ranking algorithm. The algorithm was built on the fundamental concept that the more people that link to a site, the more socially popular, authoritative and trustworthy the site is, which in turn equates to better search engine visibility. Without question, link building is the most effective way to increase your website’s visibility on the search engine results page and drive organic traffic to your site. It is fundamental to SEO and without links, your site will fail.

Indeed, Google notes on their site that:

“In general, webmasters can improve the rank of their sites by increasing the number of high-quality sites that link to their pages”

The concept of link popularity has always been based on the simple premise that people link to good sites, and if a lot of people linked to a particular website, then it must be useful and deserves a high PageRank and a boost in rankings so people can find it faster and easier.

This is the fundamental concept that Google’s core algorithm has been built on. Once that becomes compromised, the entire system becomes corrupted and spammy sites with bad, low quality information become more visible to users, contributing to a bad user experience.

However, link building is also by far the most complex and challenging SEO activity, and over the years, desperate webmasters have tried everything humanly possible to manipulate Google’s ranking algorithm with one link building strategy or another primarily aimed at artificially boosting the ranking of their websites.

Spammy Guest Blogging for Links

By 2011, guest posting had become the most popular inbound marketing strategy and the number one way to build organic backlinks, with a great many blogs accepting money for posts.

None of this went unnoticed by Google, and Matt Cutts posted a video in which he warned about the spam and abuse he was seeing in the guest blogging space.

In this video, Matt Cutts talks about how to guest blog without it looking like you paid for links:

In a subsequent blog post he wrote in 2014, an angry Matt Cutts went a lot further and outright warned webmasters to stop using guest blogging as a way to get links because in his opinion, it had become far too spammy.

In the post, he even shared a spam email he had received through his blog from someone offering money for links. His warnings have largely fallen on deaf ears, mainly because guest blogging is by far the easiest way to get dofollow links.

However, there is a problem: Google has made it extremely easy for site owners to waste money on completely useless links that do nothing for their site’s organic search ranking. Whenever a link building method gets abused, Google watches carefully and looks at how to combat it, and SEOs had been expecting a clampdown on guest blogging for years.

Now, it would have been very easy for Google to simply penalize sites that sell links using the outbound links penalty they released in 2013. However, they have chosen to deal with guest blogging spam in a very subtle way that now makes it practically impossible to analyse the true value of a specific website’s link equity from looking at its metrics.

Google’s manual actions team quietly issued this penalty in April 2016. However, it went virtually unnoticed because of the way it penalizes sites. In fact, it is barely talked about, and the average blogger is not even aware that it exists. The outbound penalty devalues blogs by changing the site’s PageRank score to 0, making the site completely worthless for link building purposes.

The penalty does not impact a site’s traffic, rankings or visibility. It only affects the PageRank of the site which is invisible to users. This means that if the site had a high Moz score, Ahrefs score, Domain authority, Page authority or Trust Flow when it was penalized, it will continue to have the same metrics even after it has been penalized.

Consequently, the site will continue to appear to the outside world to be a strong site even though the site owner would have received an unnatural outbound link penalty warning in their Search Console.

Below is a screenshot of the penalty.

However, if they’re getting a nice little earner from their site, where is the incentive for a webmaster to disclose this to potential buyers? So the big loser here is the person buying links from the site, because they are paying for a link that is effectively worthless despite the strong metrics it might show. Essentially, Google doesn’t want people buying links to their sites, and this is its way of punishing the seller, by rendering their site worthless, and the buyer, for participating in the scheme.

Today, PageRank is the only signal that would tell you for sure whether a site has been penalized, but Google stopped updating toolbar PageRank in 2013. So there’s really no way to tell whether a site has been penalized by analysing its metrics, unless you know someone inside Google or the owner is honest enough to disclose the fact to you.

So, is there a way you can tell whether a site has been penalized?

While there’s no sure way of knowing whether a site has been penalized by analyzing its metrics, there are clues you can look for that should make you think twice before posting a guest blog on a particular website. This means you need to do some due diligence before you ever consider buying guest blogging links.

Red Flags to Look Out For

Majestic flow metrics

Trust Flow (TF) is a link authority metric developed by Majestic. It is used to evaluate the trustworthiness, authority and credibility of the websites and directories linking to a particular site. The metric is scored out of 100; a high Trust Flow means the site has a lot of highly trusted sites linking to it and, as such, is likely to rank well in the search results.

Prior to the release of the outbound link penalty, this was a very effective way of identifying sites with spammy link profiles, and it is where you may want to begin your analysis. However, even if the site has a good trust ratio, for the reasons outlined above you need to continue your analysis.

Is the site ranking for its brand name?

Type the domain name into Google without the TLD. For example, if you were checking out leadflowexperts.com, you would type in lead flow experts or leadflowexperts. The site should be the first result.

Note however, that if the site is using a highly searched exact match domain name, the site may rank on the 2nd or 3rd page because Google isn’t sure if the searcher is looking for the company or is searching for the exact terms. For example, cheapinsurance.com ranks on the 2nd page for their domain name.

If the site is using a unique, brandable domain name and doesn’t rank for its domain name, this is a clear sign that the site has been penalized. If you’ve been buying links and are not ranking for your own domain name, your link profile might very well be full of links from blogs that have been hit with this penalty.

How contextually relevant are the sites it is linking to?

Consider the site’s outbound links. Is there an excessive amount of irrelevant outbound links? What type of niches is the site linking to? Poor quality sites accept articles from most niches. Even though they might be careful enough to avoid toxic links by not linking to gambling or adult sites, if the outbound links are completely irrelevant to each other (health, DIY, food, loans, finance, etc.), you’ll want to avoid such sites because they are potentially toxic.

Usually, these types of sites will have thin content posts of around 300 words with one keyword-rich anchor text link. You don’t want your links to live with irrelevant or low quality links. These types of sites are specifically setup to sell guest blogging links, and are a prime target for the penalty.

Does the site use overoptimized anchor text?

Much like the content on your pages, your backlinks’ anchors tell Google what your page is about. Look at the anchor text distribution on the site. Does the site use keyword-rich or exact match anchor text, or does it use natural language when linking out to other sites? Does it link to internal pages or just to external sites? A website should use natural language in its anchor text, with a wide variety of seemingly random anchors including naked URLs, zero match anchor text (such as “click here” and related words), branded terms, LSI keywords, page titles, and so on.

Are the links editorial?

Google wants the majority of links to your site to be editorial. As far as Google is concerned, the only logical reason a site would link to another page is that it is a strong editorial resource. A site that is full of keyword-rich, dofollow links to product pages, sales pages, social media pages and home pages, rather than natural links to relevant, informational pages, is not a site you want links from. For the most part, links to product pages need to be nofollowed because there is no editorial value in linking to a product page; it is obviously a paid link.

If the site has not been penalized, it is just a matter of time before the Google algorithm penalizes it because there is no logical reason why you would want to link to a product page or home page as a resource. If you are getting a lot of links from these types of sites, it is only a matter of time before Google hits your site with a manual inbound penalty.

Conclusion

It is critically important to do your due diligence before shelling out your hard earned cash, or you will simply be buying links that are virtually worthless, and enough of those types of links will ultimately get you into trouble. If you’re not sure whether a blog has been penalized, the best thing you can do is contact the blogger and ask them to nofollow the link.

Nofollow links are a key part of a natural link profile. So even if the blog has been penalized, the link to your site will still look natural, and it will not look as though you’re trying to manipulate Google’s PageRank algorithm by getting a dofollow link, which would be discounted altogether.

While there is no sure way to determine whether a site has been penalized by the outbound links penalty, you should avoid sites that do not appear to be complying with Google’s ultra-strict quality guidelines. This is critically important because links are the most important signal in Google’s ranking algorithm.

Note that if your site is associated with spam, Google will have no trust in your site, and it will be very difficult for you to rank organically in search.

How to Optimize for the Rankbrain Algorithm – Infographic

RankBrain is a machine-learning artificial intelligence algorithm built into Google’s core algorithm to improve the accuracy and quality of search results. Machine learning is where a computer teaches itself to do something rather than being taught by humans. Artificial Intelligence is where a computer can be as smart as a human being in the sense of acquiring knowledge from being taught and from building on what it knows.

RankBrain is a component of Google’s core search algorithm, known as Hummingbird.

Here’s what you need to know about RankBrain:

A major ranking factor. Google has called RankBrain the third most important ranking signal, after content and links.

Interprets complex search queries. RankBrain’s initial role was to interpret the specific intent of complex long-tail queries rather than focusing on individual keywords used in the query. Today, Google uses RankBrain to process every query handled by the search engine.

Converts keywords into concepts. RankBrain works by converting search terms into topics and ranking pages that comprehensively cover the theme.

Analyses user engagement signals. RankBrain identifies the best pages for given search queries by looking at user experience signals such as organic clickthrough rate, dwell time and bounce rate.

Ascribes a quality score to webpages. RankBrain analyses web pages for relevance and assigns each a score between 1 and 10, with 1 being least relevant and 10 being most relevant.

SERP CTR is key. RankBrain analyses SERP clickthrough rate and determines whether users are satisfied with the result by looking at user engagement signals such as dwell time and bounce rate.

Increase CTR with emotional triggers & power words. Using emotional triggers, power words and brackets in addition to relevant keywords in your title and descriptions can effectively increase your SERP CTR for RankBrain.

LSI Keywords. Using LSI Keywords in your content gives RankBrain the context needed to fully understand your page.

RankBrain loves long form content. Optimizing for RankBrain involves publishing long form content that comprehensively covers the topic you’re writing about.

How to Optimize for Featured Snippets – Infographic

A featured snippet is a summarized answer to a user’s query displayed at the top of Google’s organic search results. It is extracted from one of the ranking pages and is shown along with that page’s title and URL. If you type a question-based query, Google will often answer it with a featured snippet.

Here’s Moz’s Rand Fishkin’s take on how to get featured in rank 0:

The pages that get featured are those that already rank in the top 10, so to increase your chances of appearing, the first thing you’ll want to do is explore “Search Analytics” in Search Console to find the pages with the best potential of getting featured.

Search Traffic -> Queries -> Filter Queries by various question word (how, when, where, etc.)
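
If you would rather pull this list programmatically, the Search Console API exposes the same Search Analytics data and lets you filter queries by question words. The sketch below assumes you have already set up a service account with access to the property; the credentials file, property URL and date range are placeholders.

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

# Sketch of pulling question-style queries from the Search Console API
# instead of filtering by hand in the UI. The credentials file, property
# URL and dates below are placeholders.

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
credentials = service_account.Credentials.from_service_account_file(
    "service_account.json", scopes=SCOPES)
service = build("webmasters", "v3", credentials=credentials)

QUESTION_WORDS = ["how", "what", "where", "when", "why", "which"]

for word in QUESTION_WORDS:
    body = {
        "startDate": "2017-01-01",
        "endDate": "2017-03-31",
        "dimensions": ["query", "page"],
        "dimensionFilterGroups": [{
            "filters": [{"dimension": "query",
                         "operator": "contains",
                         "expression": word}]
        }],
        "rowLimit": 25,
    }
    response = service.searchanalytics().query(
        siteUrl="https://www.example.com/", body=body).execute()
    for row in response.get("rows", []):
        query, page = row["keys"]
        print(f"{query}  ->  {page}  (position {row['position']:.1f})")
```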

As an ‘unofficial’ test, you can always type into Google the phrase or query you’re after, followed by the name of your business or brand. For example, say you’re after the query “how can I make an omelette in five minutes” – just type “how can I make an omelette xxx brand name” and Google will show an ‘unofficial’ featured snippet of your post, giving you an idea of what it looks like in that format so you can make any amendments needed to make it even more visually appealing.

Featured snippets appear most for informational queries which may be in the form of questions, phrases, words, fragments or statements.

To see featured snippet candidates (that didn’t make the cut), add the parameter “&num=1”, “&num=2”, “&num=3” (and so on) to the end of Google’s URLs for queries with featured snippets.
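
A tiny helper can generate those URLs for you so you can step through the candidates one at a time; the query used below is just an example.

```python
from urllib.parse import quote_plus

# Tiny helper to print the URLs described above: the same query with
# &num=1, &num=2, ... appended, so you can step through the candidate
# results one at a time in the browser.

def candidate_urls(query, up_to=5):
    base = "https://www.google.com/search?q=" + quote_plus(query)
    return [f"{base}&num={n}" for n in range(1, up_to + 1)]

for url in candidate_urls("how do you make an omelette"):
    print(url)
```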

The most important factor in getting a featured snippet is quality content, formatted in a specific way (see the formatting list below).

Being featured means getting additional brand exposure in the search results:

  • Ben Goodsell reports that the click-through rate (CTR) on a featured page increased from two percent to eight percent once it’s placed in an answer box, with revenue from organic traffic increasing by 677%.
  • Eric Enge highlights a 20–30% increase in traffic for ConfluentForms.com while they held the featured snippet for the query.

Formatting your content should take into account:

  • Paragraph – this can be a box with just text inside, or a box with both text and an image.
  • Numbered list
  • Bullet points
  • Steps
  • Charts
  • Images
  • Table

Make your steps, bulleted list or paragraphs concise and tight. This may make it easier for algorithms to determine if your content is fit for the feature.

According to GetStat, the most popular featured snippet type is the “paragraph”.

According to research by Ahrefs, 99.58% of featured pages already rank in the top 10 on Google. So if you are already ranking high for related search queries, you have a very good chance of getting featured, and you don’t have to rank number one to be featured.
