How To Survive Google’s Unnatural Links Warnings & Avoid Over-optimisation
Posted by Modesto Siotos
This post was originally in YouMoz, and was promoted to the main blog because it provides great value and interest to our community. The author’s views are entirely his or her own and may not reflect the views of SEOmoz, Inc.
Google has intensified its actions against “overoptimised” sites, and the recent updates seem able to detect various link-related overoptimisation violations that can result in short- or long-term ranking drops, with or without warnings. Even though Google announced the Penguin update on the 24th of April 2012, changes to Google’s link evaluation system had been noticeable for several weeks before the public announcement.
It appears that in some cases Google has been overzealous, hitting (temporarily) even websites with rather natural link profiles. In other cases, Google admitted algorithmic classification mistakes, publishing apologetic messages like this one Matt Cutts posted on Google+.
According to Patrick Altoft (branded3), the sites that have received unnatural links notifications fall under five main categories, and they are not just the ones participating in link exchanges or other types of link networks. Rand made a Whiteboard Friday video about 6 changes every SEO should make before the overoptimisation penalty hits. Now that Google has rolled out the Penguin update against "blatant web spam," making sure that your website's backlink profile does not violate Google's quality guidelines is quite essential, especially if your traffic has gone down.
This is a follow-up to the post ‘How to Monitor Your Website For Link Equity Loss’, which can be used to identify backlinks from low quality or penalised/deindexed websites. This post, however, covers the following link-related overoptimisation cases:
A) Excessive Link Acquisition
B) Site-wide Links Detection
C) Unnatural Anchor Text Distribution
D) Unnatural Spread of Links Authority
A. Excessive Link Acquisition Check
Acquiring a high number of links over a short period of time has never been a good practice, and webmasters need to keep an eye on the rate of acquired links, especially now that negative SEO seems to be becoming more of an issue. Phrases like the following one from Dan Thies seem to be heard more often lately:
"Both sites have received “unnatural links” messages in Webmaster Tools. Neither site has had a “link building” campaign ever. By using 3rd party tools (e.g. Majestic) I can see a lot of unnatural links pointing at both sites, but I didn’t put those links there"
There are two quick ways to check your site's link acquisition velocity, using Ahrefs or Majestic SEO.
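If you prefer to work from a raw backlink export rather than the charts those tools provide, a simple velocity check is to bucket links by the month they were first seen and flag months that spike well above the typical rate. The sketch below assumes a CSV-style export with a `first_seen` date column; the column name and date format vary by provider, so adjust them to your export. The `spike_factor` threshold is an arbitrary starting point, not a Google-defined limit.

```python
# Rough link acquisition velocity check over a backlink export.
# Assumes each row has a "first_seen" date string (YYYY-MM-DD);
# real exports from Ahrefs/Majestic use different column names.
from collections import Counter
from datetime import datetime

def link_velocity(rows, date_key="first_seen", spike_factor=3.0):
    """Count new links per (year, month) and flag months exceeding
    spike_factor times the median monthly rate."""
    per_month = Counter()
    for row in rows:
        dt = datetime.strptime(row[date_key], "%Y-%m-%d")
        per_month[(dt.year, dt.month)] += 1
    counts = sorted(per_month.values())
    median = counts[len(counts) // 2]
    spikes = {m: c for m, c in per_month.items() if c > spike_factor * median}
    return per_month, spikes

# Toy data: March shows a sudden burst of new links.
rows = [{"first_seen": d} for d in
        ["2012-01-05", "2012-02-10", "2012-02-20",
         "2012-03-01", "2012-03-02", "2012-03-03", "2012-03-04",
         "2012-03-05", "2012-03-06", "2012-03-07"]]
per_month, spikes = link_velocity(rows)
```

A flagged month is only a prompt for manual review, not proof of a problem: a genuine PR event can produce the same spike as a paid link burst.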
Extremely High Link Acquisition Velocity (Ahrefs)
Unnatural Link Acquisition Velocity (Majestic SEO historic index, cumulative view)
B. How To Check For Site-wide Links
A high number of blog-roll, header, footer or sidebar links can trigger Google’s “overoptimisation” wrath, and keeping them to a minimum would be a rather reasonable thing to do. Certainly some site-wide links may have occurred naturally, but the fewer "overoptimisation" signals your site sends out, the better. There are a few ways to quickly check your website for site-wide links, with the quickest one being Webmaster Tools.
Under 'Your site on the web' -> 'Links to your site', WMT lists the domains that link the most to your site. The domains that link several times should be flagged as potential site-wide links and checked manually.
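If you export that WMT list, the flagging step can be automated. This is a minimal sketch: `domain_counts` stands in for the exported "domain -> number of links" table, and the `threshold` of 50 is an assumption you should tune to your own profile, since a large genuinely popular site can also link to you many times.

```python
# Flag domains whose link count suggests a site-wide (footer/sidebar) link.
def flag_sitewide_candidates(domain_counts, threshold=50):
    """domain_counts: {domain: number_of_links_to_your_site}.
    Returns candidate domains, most-linking first, for manual review."""
    return sorted((d for d, c in domain_counts.items() if c >= threshold),
                  key=lambda d: -domain_counts[d])

# Toy export: one obvious footer link, one borderline case.
counts = {"partnerblog.com": 1240, "news-site.com": 3, "directory.net": 87}
candidates = flag_sitewide_candidates(counts)
```

The output is a review queue, not a verdict: open each candidate domain and check where on the page the link actually sits.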
Using Webmaster Tools to detect site-wide links is quick and easy. However, because WMT doesn't report all the backlinks Google actually sees, a more thorough investigation requires a third-party link intelligence service such as Majestic SEO, Open Site Explorer or Ahrefs. One thing to bear in mind with any third-party service is that their crawlers do not try to replicate Google's behaviour, so in some cases the data can be significantly skewed. This is particularly true for links from websites that Google has removed from its index, which third-party services will still report as normal links.
A section in the Ahrefs FAQ page reads: “Having the full information at hand you may decide on subjective estimates of links and figure out real situation with the given website or page”. In a similar manner a Majestic SEO rep commented that:
"Our index is independent of Google and will remain so. If Google has banned a site, it does not mean there are no longer links to that site. We map the link graph – not Google’s interpretation of it."
C. How To Check For Unnatural Anchor Text Distribution
Overoptimised anchor text seems like a ticking bomb, especially after Google made public the following two messages:
"Tweaks to handling of anchor text. [launch codename "PC"] This month we turned off a classifier related to anchor text (the visible text appearing in links). Our experimental data suggested that other methods of anchor processing had greater success, so turning off this component made our scoring cleaner and more robust."
"Better interpretation and use of anchor text. We’ve improved systems we use to interpret and use anchor text, and determine how relevant a given anchor might be for a given query and website."
This task is rather more complicated because, ideally, you need as much data as possible. Exporting anchor text data from as many different sources as possible is strongly recommended, e.g. Majestic SEO, Ahrefs, Open Site Explorer, Sistrix, Blekko.
Next, you would need to filter the data, removing the following:
- Dead links – These are sites that used to link to your site but no longer do. Filtering out the dead links is absolutely necessary, and some third-party services offer such tools. Otherwise, proprietary link checkers can be used, like the one we use at iCrossing UK - Alex Ovsianikov's creation. Including dead links in a backlink audit can lead to wrong conclusions.
- Deindexed linking root domains – It's rather pointless carrying out a backlink audit for Google that includes links from sites Google has deindexed. NetPeak Checker makes this task quite easy, as explained in this post.
- Nofollow links – These are unlikely to cause any overoptimisation issues and can be discounted.
- Site-wide links – These should be counted once, otherwise the anchor text distribution will be greatly skewed. Different services treat site-wides differently, so pay extra attention to how each service handles them.
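The filters above can be sketched as one pass over the merged export. The field names (`domain`, `anchor`, `nofollow`) and the idea of passing dead and deindexed domains in as pre-built sets are assumptions for illustration; real exports name these columns differently, and you would populate the sets from your link checker and NetPeak Checker results.

```python
# Filter a merged backlink export per the checklist above, assuming
# hypothetical field names: "domain", "anchor", "nofollow".
def filter_backlinks(links, dead_domains=frozenset(), deindexed=frozenset()):
    """Drops nofollow links, links from dead or deindexed domains,
    and keeps only one link per domain so site-wide links are
    counted once in the anchor text distribution."""
    seen = set()
    kept = []
    for link in links:
        d = link["domain"]
        if link.get("nofollow") or d in dead_domains or d in deindexed or d in seen:
            continue
        seen.add(d)
        kept.append(link)
    return kept

# Toy data: a site-wide duplicate, a nofollow link, and a dead domain.
links = [
    {"domain": "a.com", "anchor": "hr software", "nofollow": False},
    {"domain": "a.com", "anchor": "hr software", "nofollow": False},
    {"domain": "b.com", "anchor": "news", "nofollow": True},
    {"domain": "dead.com", "anchor": "old link", "nofollow": False},
]
kept = filter_backlinks(links, dead_domains={"dead.com"})
```

Counting each domain once is the simplest way to neutralise site-wides; if a service already collapses them for you, skip the `seen` check to avoid double-filtering.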
After applying the above filters, the remaining backlink data can be analysed by anchor text type, such as:
- Exact match targeted keywords e.g. hr software
- Broad match keywords e.g. online hr software system
- Brand terms e.g. BreatheHR, www.breatheHR.com
- Keyword + brand terms e.g. Breathe HR software system
- Image links
- Other e.g. 'click here', 'this site' and other natural anchor text that doesn't fall under any of the above categories.
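The bucketing above can be automated with a simple classifier. This is a minimal sketch under two assumptions: the analyst supplies lowercase sets of brand terms and exact-match keywords, and image links arrive with empty anchor text (which is how many tools export them); neither is guaranteed by any particular service.

```python
# Bucket an anchor text into the categories listed above.
def classify_anchor(anchor, brand_terms, exact_keywords):
    """brand_terms and exact_keywords are lowercase sets supplied
    by the analyst; matching is simple substring containment."""
    a = anchor.strip().lower()
    if not a:
        return "image"          # assumes image links export with empty anchors
    if a in exact_keywords:
        return "exact match"
    has_brand = any(b in a for b in brand_terms)
    has_kw = any(k in a for k in exact_keywords)
    if has_brand and has_kw:
        return "keyword + brand"
    if has_brand:
        return "brand"
    if has_kw:
        return "broad match"
    return "other"

# Example sets matching the BreatheHR illustration above.
brand = {"breathehr", "breathe hr"}
exact = {"hr software"}
```

Substring matching is crude (it will miss misspelled brand mentions, for instance), but it is enough to surface the distribution skew you are looking for.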
Having classified all the different anchor text variations, it is now relatively easy to spot weaknesses – pay particular attention to spikes in exact match keywords, as in the following graph:
D. How To Check For Unnatural Link Authority Spread
Don’t Let The Penguin Leave You In the Cold
About the author
Modesto Siotos (@macmodi) works as a Senior Natural Search Analyst for iCrossing UK, where he focuses on technical SEO issues, link tactics and content strategy. Modesto is happy to share his experiences with others and posts regularly on digital marketing blog Connect.
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!