Frameworks For A 'Future-Proof' SEO Strategy
Post date: Jan 24, 2017 2:31:59 PM
Over the past 10 years, many big brands (e.g. Expedia) with experienced staff at the helm (often working alongside leading SEO agencies) have suffered ranking drops and / or penalties from search engines like Google for non-compliance with their guidelines.
Many SEO agencies' own websites have also suffered ranking drops and / or penalties (e.g. freshegg.com). 'Black hat' techniques such as paid links are still in use and can still work well to this day (over 4 years after the introduction of Google's first effort to address them). Black hat link building nowadays can even include hacking sites to get links...
In this article I will outline some frameworks and best practices for a 'future-proof' Search Engine Optimisation strategy: i.e. one that should not get a site penalised by search engines at some point in the future.
This is not to say that all techniques compliant with these frameworks will be effective forever. I have done many disavowals and reconsideration requests for clients (only one of them was a client I had worked for before, and I'm 100% certain my link building wasn't what caused their ranking drop). I won't claim to be 'whiter than white', but in over 10 years of building links to clients' sites I have only ever been asked to remove one inbound link I built (and I have never been asked to disavow to Google any links I built). This is mainly because I have never underestimated Google!
Imagine the scenario of the lone SEO pitting her wits against the thousands of PhDs working for a search engine like Google globally. The lone SEO consultant / company is not cleverer than Google as a whole (technologies & staff combined)!
From my 10 years' experience in this field, a website's user experience is core for Google: quality information is usable information. When I first read Google's SEO beginners' guide around 8 years ago, I was struck by the importance placed on alt text (as it helps blind users understand the content of an image). Search engine spiders don't have eyes either! This explains why factors like site speed and (mobile) usability are all parts of the Google algorithm. It is also probably why Google appears to be ambivalent about ad blocking technologies, as they can enhance users' experience of browsing the web.
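As a concrete illustration, here is a minimal sketch of my own (not a Google tool) for auditing a page for images missing that alt text; it assumes the requests and beautifulsoup4 Python packages are installed, and the URL is a placeholder:

```python
# A minimal sketch: flag <img> tags on a page that lack alt text.
# Assumes the requests and beautifulsoup4 packages are installed;
# the URL below is a placeholder.
import requests
from bs4 import BeautifulSoup

def audit_alt_text(url):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for img in soup.find_all("img"):
        alt = (img.get("alt") or "").strip()
        if not alt:
            print(f"Missing alt text: {img.get('src', '(no src)')}")

audit_alt_text("https://www.example.com/")
```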
Search engine guidelines
Still to this day I hear (even from senior people in the IT & web design industry who should know better) that SEO is witchcraft or some type of dark art. SEO is no longer a mystery. The Google Webmaster Guidelines may be considered the 'laws of the land' in search in countries like the United Kingdom, where Google has a dominant market share. Do you want to be an outlaw, banished from the major search engine in your locality?
Google recommends best practices for various aspects of web design such as responsive design, touch design and site speed / 'instant' mobile websites. Developing sites along these lines may be considered a best practice. Bear in mind Google's advice may change over time!
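On the site speed point, here is a minimal sketch of a rough first-pass check of my own; it is no substitute for Google's own tools (such as PageSpeed Insights), it assumes the requests package, and the URL is a placeholder:

```python
# A minimal sketch: rough total download timing for a page.
# Not a substitute for Google's own speed tools; assumes the
# requests package, and the URL below is a placeholder.
import time
import requests

def rough_page_timing(url):
    start = time.monotonic()
    response = requests.get(url, timeout=30)
    elapsed = time.monotonic() - start
    size_kb = len(response.content) / 1024
    print(f"{url}: {elapsed:.2f}s total, {size_kb:.0f} KiB downloaded")

rough_page_timing("https://www.example.com/")
```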
Most search engines have webmaster guidelines, and identifying the aspects they have in common can be useful. Do not forget to look at the guidelines of the most popular search engines in any target areas where Google is not dominant (e.g. Yandex in Russia).
However, not all of search engines' communication happens through their Webmaster Guidelines and formal channels. For example, Google's written guidance on what makes a high or low quality web page may be considered vague and brief. themoralconcept.net/pandalist.html is a collection of feedback from various Google communications (including Webmaster Hangout question and answer sessions online) that tells us more clearly what Google considers to be evidence of high quality vs low quality web pages. I have created a Twitter list of themoralconcept.net's Google quality experts: https://twitter.com/SEOTipsnTricks/lists/experts-on-quality.
Google relatively recently first published (and has since updated) its internal search quality rating guidelines (for human search quality evaluators). Google's search quality team uses human raters (dubbed “search quality evaluators”) who rate sites for quality. Their work may be considered part of a feedback loop relating to the Panda algorithm update (see the 'Predicting the future updates to the Google algorithm based on past updates' section below), often to assess proposed amendments to the algorithm. Noting the changes in emphasis in the guidance over time can shine a light on Google's inner workings and on changes in the algorithm.
Doorway pages remain, in my view, a massive problem on Google. Who knows, one day the Webspam team might start reading the Wall Street Journal, the FT etc. and start to get a grip in various ways...
New technologies
Machine learning (in the form of RankBrain) is a major factor in search rankings on Google, and may be a system of similar importance to PageRank. However, we have been told by Google that there is no RankBrain 'score' and that you cannot optimise for it directly. It may nonetheless enhance Google's understanding of the differences between navigational, informational and transactional search queries in relation to 'partial match'-type domain names and link spam, which appears to be a problem to this day.
Search behaviour is no longer a desktop-only world.
The size of browsers / keyboards on smartphones may lead to different search behaviours.
The rise of voice search (in the case of search engines) is just a different way to search the same database, though with Google Now the 'query' format is different (often a question).
It is useful to be aware of the browser 'add-on' technologies that may affect the rendering of your site. For example, you should test your site with the popular ad blockers on various devices & operating systems: they might be affecting the rendering of your 'non-ad' content.
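Nothing replaces manually loading the site with the popular blockers installed on real devices, but as a first pass here is a minimal sketch of my own that flags resources an ad blocker is likely to strip; the substring patterns are illustrative only (not a real filter list such as EasyList), and the URL is a placeholder:

```python
# A minimal sketch: flag page resources that ad blockers commonly strip,
# so you can check whether 'non-ad' content depends on them.
# The patterns below are illustrative only, not a real filter list.
import requests
from bs4 import BeautifulSoup

AD_PATTERNS = ("doubleclick", "googlesyndication", "adservice", "/ads/")

def flag_blockable_resources(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    for tag in soup.find_all(["script", "iframe", "img"]):
        src = tag.get("src") or ""
        if any(pattern in src for pattern in AD_PATTERNS):
            print(f"Likely blocked: <{tag.name}> {src}")

flag_blockable_resources("https://www.example.com/")
```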
Legal guidelines
It's always worth keeping up to date on relevant legislation relating to the web, globally and in your locality, e.g.:
Ensure your domains comply with ICANN regulations - the rules of the organisation responsible for coordinating the several databases related to the namespaces of the Net.
Digital advertising policy and regulation (even if you don't have ads on your site). Google's global guidelines on paid / sponsored links (aka 'link schemes') are similar to UK law on advertorials.
Making your site compliant with accessibility legislation (e.g. the Disability Discrimination Act in the UK) can help with SEO and increase "the long click" by facilitating engagement with the site.
Cookie law - nearly every Google AdSense publisher must comply with EU cookie consent legislation, not only those based in the EU.
Predicting the future updates to the Google algorithm based on past updates
This (alongside never underestimating Google) is probably the main thing you need to do to make your SEO strategies future-proof. When Google announced the update known as Panda in February 2011 to “provide better rankings for high-quality ... with original content and information such as research, in-depth reports, thoughtful analysis … ”, it made me change the way I was working in a minor but significant way. I predicted (vaguely) the next major Google update!
Before I went freelance I worked at an e-commerce agency called Voodoo with Gordon Tebbutt, and I learnt a lot from him. One technique I learnt was article distribution as a link building tool, which consisted of:
1. writing an article with a link (usually with sculpted 'money' anchor text) to a client's site in the article's credits
2. submitting it (for a fee) to article directories like Isnare, where content was syndicated to other sites
This was (at the time of the Panda update) still an effective tool to build inbound links, and was then contributing to good (top 10) rankings for clients for very competitive key phrases (e.g. “sheet music” on Google UK). This technique would have been classified as "white hat" (compliant with the search engine guidelines of the time). I knew that, given how much of their content was duplicated elsewhere on the net, the sites carrying syndicated articles could 'easily' be classified as 'low quality' by search engines. Thus, a potential consequence of Panda was that the link building value (including Google PageRank) of any links from these types of sites could likely go down (to zero or possibly negative). Most of the sites weren't well ranked on Google and thus wouldn't pass any valuable traffic (especially as the articles were generally of low quality). I hardly ever used press releases for link building purposes, but assumed that sites with a large percentage of press release content would be affected for the very same reason. This is an example of how Google's view of a practice changed over time (from white to black hat).
Thus I stopped this practice straight away - over a year ahead of the update known as Penguin from Google, and 2 years before it was 'outlawed' in their Webmaster Guidelines. It was obvious to me that Penguin (or something similar) might be a direct consequence of Panda and follow quickly behind. Sure enough, just over 1 year later the Penguin update (in April 2012) was created to decrease the search engine rankings of websites that violated Google's Webmaster Guidelines by working to artificially increase the ranking of a website through manipulating the number & quality of links pointing to it (these tactics are known as “link schemes”). It took a very long time for Google to start to get to grips with low quality / paid / sponsored links (only as recently as 2012), and various link schemes still go on to this day...
In 2013 (2 years after I stopped article distribution and 1 year after Penguin) the following practices were designated as link schemes by Google:
1. “Large-scale article marketing or guest posting campaigns with keyword-rich anchor text links”
2. "Links with optimised anchor text in ... press releases distributed on other sites”
In 2014, Google's Matt Cutts made a video on their view of article distribution as a link building technique. It was important to draw as many logical conclusions as possible, as early as possible, from the announcement of Panda to protect my business! I was also able to advise businesses I was working with that any value of press releases for SEO purposes was likely to diminish.
Predicting future changes to the Penguin aspect of the algorithm is something we will all have to do, as since late 2016 Google are "not going to comment on future refreshes". Matt Cutts' replacement remains anonymous - possibly Google will again be less communicative on all Webspam-related matters in future...
Future changes to the Penguin aspect of the Google algorithm might include:
Targeting suspicious links with 'money' anchor text generated by hackers on compromised CMS systems (notably WordPress)
Penalising people who falsely claim to be Google Partners (using the logo etc). This is very easy to detect!
Targeting sites ranking 'out of the blue' for P0rn, Pill, Casino & 'Cheap Luxury Goods'-type phrases (when these were never before topics they covered or ranked for).
Penguin using machine learning within the algorithm, as (to the best of my understanding) Panda, reconsideration request and link disavowal data are all usable as part of 'quality' feedback loops, and it is argued that human-curated data feedback loops are critical for machine learning platforms.
More 'detective'-type investigation of 'black hat' SEO by Google, profiling companies mentioned in reconsideration requests etc. This may include providing information to legal authorities on illegal activities, e.g. sites buying hacked links (to the best of my understanding this hasn't happened hitherto).
Interlinking of penalties / ranking drops with business profiles on Google Maps (as, to the best of my understanding, these have remained separate post-Penguin)
Google raising barriers to entry in AdWords for businesses historically engaged in egregious Webspam
Future proof link building
In theory all link building for SEO purposes is against Google guidelines: “Any links intended to manipulate PageRank or a site's ranking in Google search results may be considered part of a link scheme and a violation of Google’s Webmaster Guidelines.” If you don't build links, arguably you don't do SEO!
Since the Penguin update, one popular 'black hat' link building technique (i.e. one that doesn't comply with search engine guidelines) has been the Private Blog Network (aka PBN). A PBN (also known as a link farm) is a group of blogs / websites created for link building purposes that are owned by the same entity but rendered invisible to 'all' backlink tools while remaining visible to Googlebot and other search engine spiders. PBNs are only invisible to backlink tools if those tools are blocked from accessing / indexing the sites (to avoid exposure of these links) via robots.txt or an .htaccess file (a hidden file on the server that can be used to control access to your website). New 'unblocked' backlink tools can still crawl PBN sites and expose them to competitors, who can report them to Google etc. Google can also analyse a site's robots.txt and see if it blocks the crawlers of popular backlink tools like Majestic, Ahrefs etc., and this can potentially be a flag on the site. When will we see a search engine for robots.txt files?
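To illustrate that last point, here is a minimal sketch (using only Python's standard library) that checks whether a site's robots.txt blocks the crawlers of popular backlink tools; the user-agent names are the ones these tools publish, and the domain is a placeholder:

```python
# A minimal sketch: list which backlink-tool crawlers a site's
# robots.txt blocks - the very pattern that can itself flag a PBN.
# The domain below is a placeholder.
from urllib.robotparser import RobotFileParser

BACKLINK_BOTS = ["MJ12bot", "AhrefsBot", "BLEXBot", "SemrushBot"]

def blocked_backlink_bots(domain):
    parser = RobotFileParser()
    parser.set_url(f"https://{domain}/robots.txt")
    parser.read()
    return [bot for bot in BACKLINK_BOTS
            if not parser.can_fetch(bot, f"https://{domain}/")]

print(blocked_backlink_bots("example.com"))
```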
Conclusion
To summarise: eventually Google can, and surely will, catch up with (all?) 'black hat' techniques (even if some still work well post-2012 and show no signs of going away). So it is useful to stay in line with the current Webmaster Guidelines of the major search engines in your locality & to be modest in your expectation management with clients! You should ensure that whatever may penalise your (client's) site doesn't also end up penalising your consultancy / agency business financially.
Using 'black hat' techniques because of the (short-sighted) business goal of maximum profit in this financial period might result in your site being de-indexed by Google! If you want to 'churn and burn' that's fine, but it can & will have multiple consequences. Yet people still do it to this day, as Google's detection of link spam still relies on user reports and appears to work mainly on a network-by-network basis.
Best practices and resources for future proof SEO include:
Bear in mind that Google has been argued to be waging a propaganda war against the SEO industry. Don't drink all of the Kool-Aid (including this article, moz.com etc.) - the last third is usually backwash!
Manual link building only (ideally in-house with management oversight), with all link building reported formally. Outsourcing (or even doing) off-page SEO (and / or PPC) when the Marketing Director is not extremely 'search engine savvy' is not recommended. Do not use sculpted 'money' anchor text. Search engine compliant link building led by a digital-savvy PR consultant may provide the best return on investment and the most traffic. Link building for traffic is as important as link building for PageRank!
Regular SEO strategy reviews, with written assurances (from the SEO manager and / or agency) that all (past and) present SEO tactics comply with current 'future-proof' best practices in the context of search engine guidelines.
If you are going to be 'black hat' then you should protect yourself contractually against any backlash in the event of a search engine action against your (client's) site.
Study the rich & detailed Google ranking drop and / or penalty case studies on linkresearchtools.com: they amount to a good list of "what not to do" / "what used to work" in link building. Learn the history of black hat SEO, as techniques such as 'parasite hosting' can still be of use.
To help you keep up to date on Google, www.seobythesea.com analyses their latest patents so you don't have to (for example, the Panda patent was granted three years after its launch). The author may not, however, realise all the implications of every patent!
We hope you enjoyed this post. If we may, let us ask you to do at least 1 of 2 things (it will help us a lot and we thank you in advance):
1. Please share it through Linkedin, Reddit or whatever social medium works for you :)
2. Add a comment!
This article was originally commissioned by Supermetrics.com