How Google Search Algorithm Updates Affect Your Business
07/11/2017 | Digital Marketing | 15 minutes

Like it or not, if you want to rank on Google Search then you need to pander to their ranking algorithm. The Google Gods will be most displeased if you don't, and you'll struggle to see first-page rankings unless you follow their ever-changing guidelines. If you're not immersed in the world of SEO on a daily basis then it can be difficult to keep up with these changes, let alone edit your website to comply with them. More importantly, almost all of the businesses we work with see their website as a vehicle to drive sales of their product or service, so if you're not ranking on page one then you're going to struggle to attract new customers or clients organically.
We’ve taken a look at how these algorithm updates can affect your website’s search visibility, the knock-on effect on conversions and on your business as a whole, and how you can react to these changes and stay ahead of your competition.
How do I know if I’ve been affected by a Google Algorithm Update?
Unless you're checking your organic traffic and keyword rankings on a regular basis, you won't know if you've been hit by a Google Algorithm Update until it starts impacting your enquiries or sales. At Urban Element we use Google Analytics to track our organic traffic, the SEMrush Sensor to measure volatility in the SERPs, the SEMrush keyword rankings module to track positions, and Moz's Google Algorithm Update Timeline to monitor confirmed Google Algorithm updates.
Pandering to Panda
Google Panda has been around since 2011, but Google released updates to the content-centric algorithm up until 2015. Panda aims to give users better quality search results by giving pages a quality score based on how useful and unique their content is. Pages that meet the criteria will rank and those that don't will inevitably bomb. Finding out if you've been hit by Google's Panda update means looking at historical Google Analytics data from as early as 2011, or checking when the subsequent updates happened and correlating this with your organic traffic. Once you've established that you've been smitten by Google's Panda, you'll need to identify where your shortcomings lie.
How do I recover from Google Panda?
Depending on the size of your website, Panda can be relatively easy to recover from in terms of complexity; the difficulty comes in the time it takes to recover. You may need to create fresh content, cull any pages with thin (short) content or update content on any old, outdated blog posts. First, you'll need to identify the offending pages on your site. To do this you can use a number of tools (there's also a rough DIY sketch after this list):
- Screaming Frog SEO Spider (FREE / PAID) – A number of the tabs in Screaming Frog’s SEO spider tool allow you to search for duplication in Meta Titles, H1s, Meta Descriptions and more.
- SEMrush On Page SEO Checker (PAID) – The On Page SEO Checker from SEMrush gives you a number of content optimisation ideas based on what your competition is doing and what you’re not, including advice on keyword stuffing and keyword cannibalisation.
- Copyscape (FREE / PAID) – The above tools can help you with on page content issues, but aren’t able to check duplicate content issues from across the web. Copyscape enables you to type in a URL and see if the content from that page has been used elsewhere on the web. Perfect for catching content thieves who think their copypasta from your fresh, original content will go unnoticed.
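If you'd rather get a rough first pass before firing up any of these tools, a few lines of Python can do a basic sweep for you. The sketch below is only an illustration: the URLs and the 300-word threshold are placeholders, not rules from Google, and it assumes you have the requests and BeautifulSoup libraries installed.

```python
# Minimal sketch: flag potentially "thin" pages and duplicate meta titles
# across a list of URLs you want to audit. The URLs and the 300-word
# threshold below are placeholders - tune them to your own site.
from collections import defaultdict

import requests
from bs4 import BeautifulSoup

PAGES_TO_AUDIT = [  # hypothetical URLs - replace with your own
    "https://www.example.com/",
    "https://www.example.com/blog/old-post/",
    "https://www.example.com/services/",
]
THIN_CONTENT_THRESHOLD = 300  # words; an arbitrary starting point

titles = defaultdict(list)

for url in PAGES_TO_AUDIT:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    # Rough word count of the visible page text
    word_count = len(soup.get_text(separator=" ").split())
    if word_count < THIN_CONTENT_THRESHOLD:
        print(f"Possible thin content ({word_count} words): {url}")

    # Group pages by meta title so duplicates stand out
    title = soup.title.string.strip() if soup.title and soup.title.string else "(missing title)"
    titles[title].append(url)

for title, urls in titles.items():
    if len(urls) > 1:
        print(f"Duplicate title '{title}' used on: {', '.join(urls)}")
```

It won't replace a full crawl or a Copyscape check, but it's a quick way to build a shortlist of pages worth a closer look.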
What about Google’s other pets?
Many content specialists bandy about the phrase “content is king”, and Panda’s close relation Hummingbird would agree. Like Panda, Hummingbird aims to penalise sites with poor, keyword-stuffed content, but it places more importance on searcher intent. In essence, Hummingbird has paved the way for devices such as Google Home, Amazon Echo and the personal assistants on our phones such as Siri. Our searching habits have become less direct and more conversational, meaning giving the user relevant results as quickly as possible has become increasingly important.
How can I use Hummingbird to my advantage?
The key here is to answer questions from a unique perspective. Finding a niche that other content creators haven't covered is difficult, but not impossible. I like to use the following tools to find out what questions people are asking (there's a small sketch after this list that automates part of the process):
- Google Trends (FREE) – If you want to discover what people are talking about then Google Trends has all the answers. Trends is great if you’re going to start writing straight away, but less useful if you’re trying to create a calendar of content to blog or post about over an extended period.
- SEMrush Keyword Magic (PAID) – Although still in BETA testing, SEMrush Keyword Magic allows you to view the questions being asked around a particular search query, how difficult it would be to rank for that query and the query’s search volume.
- Google Adwords Keyword Planner (FREE / PAID) – If you have an active Adwords account then the chances are you’ve come across Google Adwords Keyword Planner. Prior to using SEMrush I would type “how, what, why, where, when” into ‘Keywords to include’ and get a list of common questions this way.
- Keywordtool.io (FREE / PAID) – Like SEMrush Keyword Magic, keywordtool.io allows you to type in a query and see the questions related to it. You don't get any other metrics unless you pay, but combine these queries with Google Adwords Keyword Planner and hey presto, you've got search volumes and competition scores.
- Forums (FREE) – Okay, not strictly a 'tool', but by browsing forums you can easily find what questions people are asking and which may not have been answered yet. Use the Google Search operator 'inurl:forum' after your search query to get a SERP full of forums related to your topic.
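If you fancy automating some of this research, here's a rough sketch of the "how, what, why, where, when" trick from the Keyword Planner tip above. The seed keywords are placeholders, and the Google "suggest" endpoint it calls is unofficial and undocumented, so treat the output as a starting point to paste into Keyword Planner or keywordtool.io rather than gospel.

```python
# Minimal sketch: build question-style queries around your seed keywords,
# then expand each one via Google's unofficial autocomplete endpoint.
# That endpoint is undocumented and may change, so treat it as best-effort.
import requests

SEED_KEYWORDS = ["web design", "seo audit"]  # hypothetical seeds - use your own
QUESTION_WORDS = ["how", "what", "why", "where", "when"]

SUGGEST_URL = "https://suggestqueries.google.com/complete/search"

def suggestions(query):
    """Return autocomplete suggestions for a query (unofficial endpoint)."""
    resp = requests.get(SUGGEST_URL, params={"client": "firefox", "q": query}, timeout=10)
    resp.raise_for_status()
    return resp.json()[1]  # response looks like ["query", [suggestion, ...]]

for seed in SEED_KEYWORDS:
    for question in QUESTION_WORDS:
        query = f"{question} {seed}"
        print(query)
        for idea in suggestions(query):
            print(f"  - {idea}")
```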
Pleasing the Pigeon and the Possum
Pigeon was released in 2014 to help Google understand where its users are located and adjust its SERPs to show results that are close to the searcher. You can help to please the Pigeon algorithm by adding Organization or LocalBusiness schema.org mark-up to your site: a universally understood mark-up used by search engines to gain a better understanding of website content. Another way of telling Google where you're located is by adding your business listing to local directories with consistent NAP (Name, Address, Phone number).
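If you're not sure what that mark-up looks like in practice, the sketch below generates a LocalBusiness JSON-LD snippet you can paste into your page template. Every business detail in it is a placeholder, so swap in your own NAP details and keep them consistent with your directory listings.

```python
# Minimal sketch: generate a schema.org LocalBusiness JSON-LD snippet to paste
# into your page template. All business details below are placeholders -
# replace them with your own and keep them consistent across the web.
import json

local_business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Agency Ltd",
    "url": "https://www.example.com/",
    "telephone": "+44 1234 567890",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "1 Example Street",
        "addressLocality": "Witney",
        "addressRegion": "Oxfordshire",
        "postalCode": "OX28 0AA",
        "addressCountry": "GB",
    },
}

snippet = (
    '<script type="application/ld+json">\n'
    + json.dumps(local_business, indent=2)
    + "\n</script>"
)
print(snippet)
```

It's worth running the output through Google's Structured Data Testing Tool before it goes live on your site.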
More recently, in 2016, Google released an extension to its local algorithm called Possum, which placed more importance on the physical location of the searcher. As we have learned at Urban Element, it’s a challenge to rank for “web design oxford” when your physical address is in Witney. For us, and many of our clients, this has prompted a shift towards tracking keywords based on user location rather than national keyword tracking.
What about backlinks?
Links have been important to Google's ranking algorithm from the very beginning and its spiders use these links to crawl from site to site, indexing as they go. However, this part of the algorithm was open to exploitation and it wasn't uncommon for SEOs to buy backlinks in an attempt to gain higher search rankings. When Penguin was launched in 2012, Google had a very clear goal: to audit these links and crack down on any unnatural or manipulative linking practices.
How do I recover from Google Penguin?
Dealing with internal links is easy: either remove the link or slap a 'rel=nofollow' on it. External links are trickier to tame. Most of the time external backlinks are out of your control; however, Google recognises this and offers a 'disavow tool' where you can ask Google's robots to ignore links from a particular domain or page.
How do I find bad backlinks?
Google Search Console allows you to see the sites within its index that are linking to your site, but it's difficult to know which links are good and which are bad. Many SEOs will suggest Ahrefs or Majestic SEO for deep link analysis, but for a lot of businesses the cost of these tools will outweigh the benefits. Personally, I'm a fan of using the SEMrush Backlink Audit tool, which breaks links down into those it believes to be 'toxic', 'potentially toxic' and 'non-toxic'. This way you can see which links are doing harm to your visibility and which are beneficial to your overall SEO, and you can even create your own disavow file from within the tool. Pretty neat, huh?
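Once you've decided which links have to go, the disavow file itself is just plain text: one URL per line, or 'domain:' followed by a domain to disavow a whole site, with '#' lines treated as comments. The sketch below builds one from a hypothetical list of toxic domains and URLs, ready to upload via Google's disavow tool.

```python
# Minimal sketch: turn a list of links you've decided are toxic into a
# disavow.txt file for Google's disavow tool. The domains and URLs below
# are hypothetical - only disavow links you've actually reviewed.
TOXIC_DOMAINS = ["spammy-directory.example", "link-farm.example"]
TOXIC_URLS = ["http://dodgy-blog.example/guest-post-1"]

lines = ["# Disavow file generated after a manual backlink review"]
lines += [f"domain:{domain}" for domain in TOXIC_DOMAINS]  # disavow whole domains
lines += TOXIC_URLS                                        # disavow individual URLs

with open("disavow.txt", "w") as f:
    f.write("\n".join(lines) + "\n")

print("\n".join(lines))
```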
What is Google’s most recent algorithm update?
Fred is Google's most recent algorithm update. Like Panda, Fred targets pages with thin content, but it also takes a disliking to article-style pages that offer little variety in the media they display. In addition, Fred targets sites with a large number of affiliate links and adverts that dominate the page, meaning that many sites relying on advertising revenue are having to tone down their ratio of ad content to page content.
TL;DR
Google just wants you to create good, useful content that is shareable, unique and will generate traffic organically. They have a number of basic principles and specific rules set out in their Webmaster Guidelines; if you're interested in staying in Google's good books then we suggest you check them out.