The Basics of Reversing a Google Penalty
Has your website plummeted in the search results? Not sure why Google dislikes your website? We have you covered. This article is a basic overview of the most common factors that can result in a website being penalised by Google.
There are many reasons this can happen, but they essentially break down into two components: on-site and off-site factors.
On-site refers to the website itself; off-site refers to backlinks.
Backlinks are links that point to a website and are a fundamental element of the Google algorithm.
At its core, there are two types of Google penalty: algorithmic and manual.
An algorithmic penalty can be triggered as a result of on-site and off-site factors.
Once issues are corrected, the website should “bounce back”. This usually happens once Google discovers the changes and/or an algorithm update rolls out, such as Google Panda or Penguin.
A manual penalty is a lot more difficult to deal with. It means someone at Google has manually reviewed the website and all the backlinks pointing to it. A manual penalty cannot be lifted unless a Google employee removes it from their end.
How do I know if I have been hit with a manual penalty?
Manual penalty “notices” are provided in Webmaster Tools under the “Search Traffic” tab, and listed under “Manual Actions”.
These penalties can be a result of on-site factors, but in most cases, they are the result of backlinks. Once the problems are corrected, the Webmaster must submit a reconsideration request to Google via Webmaster Tools telling Google what they have done to correct the issue(s).
How long does it take for Google to respond?
You should hear from Google in around 2-3 weeks. In some cases, however, it can take up to six weeks to hear back about a reconsideration request. Google will inform the Webmaster whether what they have done is enough to lift the penalty, or leave the penalty in place. Usually they are not very forthcoming in telling a Webmaster what they need to fix, instead directing them to their “Webmaster Guidelines” as a point of reference.
Things to Look for When Trying to Identify a Google Penalty
Onsite Factors – Your Website
There are countless on-site (website) factors that can result in either an algorithmic or manual penalty. Most of the penalties in regards to on-site factors are algorithmic. Let’s cover some of the most common issues as a starting point.
- Over-optimisation: There are many ways to over-optimise a website; the most common is stuffing target keywords into meta titles, alt tags, content and anchor text (internal links), as well as overusing headings, bold text, etc. Typically webmasters use a keyword density that is simply too high and unnatural. Natural keyword density is commonly around 1-3%.
- Hidden Content & Links: Hidden content can quickly trigger a Google penalty. In yesteryear, webmasters would hide content such as target keywords and links by setting the text colour to match the background colour of the website. To the user this content was invisible, but Google could still read it. Google comes down hard on these techniques. Other examples are sneaky redirects, scripts, and “read more” text hidden behind a link.
- Duplicate Titles: This can be the result of something completely innocent and outside the webmaster’s control. It occurs when a large percentage of pages on a website have identical or very similar titles; issues with a CMS can trigger such a problem. It is important to keep the title of each page as unique as possible.
- Inner Linking: This refers to the use of anchor text (the text in a link) and how keywords are used to link pages within a website. Overuse and repetition of keyword-rich anchor text can trigger an over-optimisation penalty.
- Low Quality Content: Google is much smarter at identifying low quality pages than it once was. Poor grammar and spelling can affect a site’s ranking and reduce the quality of a website in Google’s eyes. It is critical that webmasters ensure that their content is free of these issues.
- Duplicate & Scraped Content: Content is syndicated around the web all the time. Google does a great job of identifying the original publisher as the one to rank, but it can, and does, get it wrong sometimes. If a webmaster has scraped, borrowed, or stolen content, and a big percentage of their text comes from other websites, Google may deem the site low quality and rank it accordingly. Webmasters should ensure that the content on their website is as original as possible, in order to avoid issues such as Google Panda affecting their site.
- Thin Content: Does the site only have a few pages? Is it set up only to target a few keywords? Google hates sites that offer little or no value to users. Usually the hardest hit are low-quality affiliate sites that exist only to redirect users to sales pages, or to lead pages that extract email addresses, etc.
- Overlapping Content: This issue is the result of multiple pages, in some cases even dozens or hundreds, created to target a particular keyword and its variations. It is critical that overlapping content be minimised. Google can deem that the content exists for no reason other than to rank for as many keyword variations as possible and, as a result, will penalise the website.
- Paid / Sponsored Links and more: Google has no issue with Webmasters selling advertising space on their website. The catch is that paid advertisements such as banners and sponsored links must be set to “no follow” instead of the default, “do follow”. Google does not want “link juice” passed through links that are in place for advertising purposes, and leaving them “do follow” can trigger a penalty. ALL advertising banners, sponsored links and everything in between should be set as “no follow” links.
- EMD: Exact match domains are domains that have commercial keywords as the domain name itself. Until recently it was very easy to buy such a domain and rank a lot quicker and easier because the keyword was the domain name itself. Google has come down hard on such tactics, although they can still work very well; it is very easy to over-optimise both on-site and off-site elements when commercial keywords are also the domain name.
- Hacked / Malware Infection: Google understands that sites can get hacked, and as a result there are measures in place within Webmaster Tools to alert webmasters and allow them to reverse any damage. A malware-infected website needs to be completely clean before a malware reconsideration request is submitted to Google. Another issue with a hacked website is that hackers will insert hidden links pointing to websites they want to rank. These links are typically invisible to the user and are often hidden off screen. A quick and easy way to find out if a page has hidden links is to view the page source.
- Affiliate Sites: Google hates affiliate sites, and for good reason. Most of them offer very little value to the user and usually contain content that has been stolen or scraped from other websites. That is not to say affiliate sites cannot rank; they can, and some rank well. Unfortunately, a large percentage of them do a poor job of aligning with their target market and usually create websites that are thin on content and, most importantly, value! These sites are set up with one idea in mind: to make money as quickly and as easily as possible. It is very common for content to be stolen from others, and almost impossible to police and control.
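The 1-3% keyword density figure mentioned in the over-optimisation point above can be sanity-checked with a short script. This is a rough, hypothetical calculation, not an official Google metric; the sample text and the `keyword_density` helper are purely illustrative:

```python
import re

def keyword_density(text, keyword):
    """Percentage of words in `text` accounted for by occurrences
    of `keyword` (which may be a multi-word phrase)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    kw = keyword.lower().split()
    n = len(kw)
    # Count every position where the full phrase appears
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == kw)
    # Each hit contributes n words towards the total word count
    return 100.0 * hits * n / len(words)

sample = ("Cheap widgets for sale. Our cheap widgets are the best "
          "cheap widgets you will find anywhere online.")
print(round(keyword_density(sample, "cheap widgets"), 1))  # 35.3 - far above the 1-3% range
```

A result in double figures, as in this stuffed sample, is exactly the kind of unnatural density an over-optimisation penalty targets.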
Off-site Factors – Backlinks
Backlinks are a fundamental component of how Google ranks websites. As a result, Google has upped the ante when it comes to eliminating certain backlinking techniques and strategies that, for a very long time, worked. Not only did they work, they were extremely effective and affordable. Algorithm layers such as Google Penguin have done a great job of killing many of these techniques.
Here are some factors to consider when reviewing the backlinks of a website to determine whether it has been penalised or had its backlinks devalued by Penguin and other algorithmic changes.
- Anchor Text Ratios – Have keywords been hammered in the anchor text of backlinks? Does the anchor text look natural, or is it smashed with target keywords repeatedly? Natural anchor text levels vary between keywords, but less than 20% is ideal.
- Link Volume – Do you have any links? Do you have enough strong links to rank for your keywords within your niche/market?
- Link Diversity – Do you have different “types” of backlinks pointing to your website?
- Link Velocity – Speed that backlinks are being discovered; sudden spike or huge drop?
- Spammy links? Such as forum profiles, low quality social bookmarks, article directories etc.?
- Relevancy – Do you have a large percentage of your backlinks from irrelevant websites?
- Too many links to your homepage / low volume to inner pages?
- Site-wide links? Do you have 1000s of links coming from the same domain, ie footer / sidebar links etc.?
- Referring IPs – big percentage of your links coming from the same IP address?
- Do Follow vs No Follow Ratios – Do you have any “no follow” links, or an unnatural ratio?
- Paid Links – Are your sponsored / paid links “dofollow” instead of “nofollow”?
- Foreign Links – Tonnes of links from foreign websites – Russian, Chinese etc.?
- Have you been hacked? – Do you have 1000s of non-targeted links pointing to pages created by malware or a hacker?
- SAPE Links – Do you have links on hacked websites? These are typically Russian websites; links in the footer and/or sidebar or hidden within the source code.
- Check WMT for Manual Penalty – Check “manual actions” under the “Search Traffic” tab in WMT. Do you have a manual penalty in place?
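The anchor text ratio check in the first bullet above can be sketched in a few lines. The backlink anchors below and the ~20% guideline are illustrative assumptions, not figures published by Google:

```python
from collections import Counter

def anchor_ratios(anchors):
    """Return each anchor text's share of the backlink profile as a percentage."""
    counts = Counter(a.lower().strip() for a in anchors)
    total = sum(counts.values())
    return {a: round(100.0 * c / total, 1) for a, c in counts.items()}

# Hypothetical anchor texts pulled from a backlink report
links = ["buy blue widgets", "buy blue widgets", "Example.com",
         "click here", "buy blue widgets", "https://example.com",
         "buy blue widgets", "example", "read more", "buy blue widgets"]
print(anchor_ratios(links)["buy blue widgets"])  # 50.0 - well past the ~20% guideline
```

A natural profile is dominated by brand names, bare URLs, and generic phrases like “click here”; when a single commercial phrase takes half the profile, as here, the links look manufactured.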
Bad Backlinks? Google Disavow Tool:
Google is aware that webmasters cannot control who links to their website. As Google has become more ruthless and unforgiving about backlinks, it had no choice but to give webmasters the ability to “disavow” backlinks.
The Google Disavow Tool allows webmasters to submit a list of links to Google, instructing Google to remove any credit for, and association with, the links pointing to their website. Submitting a disavow file does work, but it can be difficult to hit the nail on the head. Disavow too many links and the website may suffer in rankings as a result; disavow too few and any “filters” limiting the website’s ability to rank will remain in place.
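A disavow file is a plain-text list with one entry per line: a `domain:` entry disavows every link from that domain, a bare URL disavows a single page, and lines starting with `#` are comments. A minimal example (the domains are made up):

```
# Spammy directory links - owner contacted, no response
domain:spam-directory.example
domain:bad-bookmarks.example

# A single paid link that could not be removed
http://blog.example/sponsored-post.html
```

The file is uploaded through the Disavow Tool for the affected site, and should be a last resort after attempting to have the links removed at the source.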
Have You Been Hit by a Google Penalty?
Reversing a Google penalty can be tricky. Please keep in mind this is a very basic guideline. If your website has been hit with a Google penalty, it’s important that you contact an SEO professional to begin the process of recovering your rankings.