Why SEOs need to be able to predict the future...

posted 21 Mar 2016, 05:49 by C Byrne   [ updated 30 May 2016, 06:40 ]


(like the original Google algorithm update known as "Penguin"!)


When Google announced the Panda update in February 2011, promising to “provide better rankings for high-quality sites - sites with original content and information such as research, in-depth reports, thoughtful analysis and so on” (http://googleblog.blogspot.co.uk/2011/02/finding-more-high-quality-sites...), it made me change the way I worked in a small but significant way.


Before going freelance as an SEO and PPC consultant in 2008 I worked at http://www.voodoo.co.uk/ with Gordon Tebbutt as my manager, and he taught me a lot. One technique he taught me was article distribution as a link building tool, which consisted of:

1. writing a quick article with a link (with sculpted anchor text) to a client's site in the byline,
2. submitting it to article directories like Isnare, in the hope that others would republish it (byline included) on their own sites to add some free content.

This was still an effective tool at the time of the Panda update. However, because so much of the content on sites carrying syndicated articles was duplicated elsewhere on the web, I knew those sites would be easily classified as low quality. A likely consequence was that the value (including Google PageRank) of any links from these types of sites would drop, possibly to zero. I have never really got involved with press releases for SEO purposes, but I assumed that sites made up largely of press releases would be affected in the same way, for the same reasons.

So in my freelance work I stopped this practice right away, over a year ahead of the Penguin update from Google. I could see that Penguin (or something similar) might be a direct consequence of Panda. The Penguin update was aimed at decreasing the search engine rankings of websites that violate Google’s Webmaster Guidelines by artificially inflating a webpage's ranking through manipulating the number and quality of links pointing to it (tactics commonly known as “link schemes”). It took Google a very long time to start to get to grips with low quality links (only as recently as 2012), and that work continues to this day.

In 2013, Google stated that links in press releases should use 'nofollow', like paid links: see http://searchengineland.com/google-links-in-a-press-release-should-be-nofollowed-like-advertisements-168339.
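For illustration, this is what a nofollowed link looks like in a page's HTML (the URL and anchor text here are made up):

```html
<!-- rel="nofollow" tells Google not to pass PageRank through this link -->
<a href="http://www.example.com/" rel="nofollow">Example client site</a>
```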

In October 2012 Google announced a tool that enabled webmasters to disavow links to their sites, effectively using Penguin to build a list of low quality sites for Google to check the Panda algorithm against. More free labour for Google from SEO consultants, like webspam reports! The disavow tool creates a feedback loop (a system for improving a product or process by collecting and reacting to users' comments) through which the Panda and Penguin algorithms can be refined. The feedback takes the form of lists of sites hosting low quality links (which may then, by inference, be considered low quality sites). In a real sense, if you disavow links to a website you may be helping Google refine, or even write, part of its algorithm.
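For context, the disavow tool takes a plain text file upload: one URL or domain rule per line, with '#' marking comment lines. A minimal sketch (the domains below are hypothetical):

```
# Hypothetical disavow file
# Disavow all links from an article directory's entire domain:
domain:example-article-directory.com
# Disavow the link from one specific page:
http://www.example.com/syndicated-article.html
```

Every file uploaded this way hands Google a curated list of sites that an SEO considers low quality, which is the feedback loop described above.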

In 2013 the following practices were listed as link schemes:
1. “Large-scale article marketing or guest posting campaigns with keyword-rich anchor text links”
2. “Links with optimised anchor text in ... press releases distributed on other sites”

In 2014, Google published a video giving their view on article distribution as a link building technique:


Should I build links using article directories?



I hope you enjoyed this post. If I may, let me ask you to do at least one of two things (it will help me a lot, and I thank you in advance):

1. Please share it through Twitter, Reddit or whatever works for you.
2. Add a comment!