What are Google algorithms?
To provide the most relevant information to searchers, Google uses algorithms and multiple ranking factors, applied to the data in its index, to rank web pages in its search results.
In the past, these algorithm changes happened every so often, but, as technology advances and searches become more detailed, Google regularly releases new algorithms. Many of these updates are so minimal that barely anyone notices them! However, the following updates have affected websites drastically over the years:
- Google Panda Update
- Google Penguin Update
- Google Pigeon Update
- Google Hummingbird Update
- Google Mobile-Friendly Update
- Google Page Experience Update (June 2021)
Our helpful guide will walk you through every aspect related to these key updates and what you need to be aware of.
We will also be providing clarity on why they came to fruition and the changes SEO campaigns have had to face in response. So, let’s get started!
Google Panda Update
Launched: 24th February 2011
Goal: The Panda update was implemented to reduce thin, low-quality content and reward relevant, engaging content.
How it works: Google assigns each page a quality score, which is then used as a ranking factor.
This update is one that changed the course of SEO as we know it. Over time, poor practices such as keyword stuffing and plagiarism built an untrustworthy and spammy reputation for SEO.
The effect this update had on web pages was mild up until 2016 when it became a permanent feature in Google’s core algorithm rather than just a search filter.
To avoid penalties from the Panda updates, here’s what you need to steer clear of on your website:
- Duplicate Content
- Thin Content
- Keyword Stuffing
- Poor User Experience
- User-Generated Spam
Regular site audits and crawls are key to locating these issues. Many tools, such as Screaming Frog and Moz, can run a site crawl for you. Additionally, to check whether your content has been duplicated elsewhere on the web, you can use plagiarism checkers such as Quetext and Grammarly.
A useful indicator of thin content is the ratio of links to word count. If a page contains many affiliate links and images but little useful information, it would be considered thin content.
The ideal word count for each web page will naturally differ depending on the keywords you are targeting and the objective of the page. For example, if the user is searching for a query that requires a quick response, e.g. the definition of a simple word, pages with little content tend to perform well. For queries that require detailed, in-depth answers, however, pages are expected to be more substantial; 600 words is a sensible minimum. To be clear, word count alone doesn’t determine where you rank, so writing 2,000 words without targeting intent or providing value will not improve your rankings.
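The link-to-word-count idea above can be sketched in a few lines of Python. The 600-word minimum and the links-per-100-words limit here are illustrative assumptions for the sketch, not thresholds Google has published:

```python
from html.parser import HTMLParser


class PageStats(HTMLParser):
    """Counts outbound links and visible words on a page (rough heuristic)."""

    def __init__(self):
        super().__init__()
        self.link_count = 0
        self.words = 0
        self._skip_depth = 0  # inside <script>/<style>, text isn't visible

    def handle_starttag(self, tag, attrs):
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.link_count += 1
        if tag in ("script", "style"):
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth:
            self.words += len(data.split())


def looks_thin(html, min_words=600, max_links_per_100_words=10):
    """Flag a page as potentially thin: too few words, or link-heavy."""
    stats = PageStats()
    stats.feed(html)
    if stats.words < min_words:
        return True
    return stats.link_count * 100 / stats.words > max_links_per_100_words


# A page that is almost entirely affiliate-style links gets flagged:
page = "<p>Buy now!</p>" + '<a href="https://example.com/offer">offer</a>' * 20
print(looks_thin(page))  # → True
```

This is only a first-pass filter for your own audits; a flagged page still needs a human read before you decide it is genuinely thin.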
Google Penguin Update
Launched: 24th April 2012
Goal: The Penguin update aimed to reduce black hat link building practices, such as buying links from link farms and private blog networks (PBNs).
How it works: Google penalises web pages that receive links from poor-quality sites with over-optimised anchor text. This update only takes into account the links pointing to a site; it does not consider outgoing links.
Since its launch, there have been several iterations of this algorithm, each giving affected websites a chance to clean up and recover. Those who had gotten away with continuous spammy link practices began to see an impact in 2014.
In 2016, like the Panda update, the Penguin update became part of Google’s core algorithm, meaning that the impact of backlinks to your site is assessed in real time.
Disavowing bad links
To avoid a reduction in organic traffic, Google recommends using its disavow tool. However, this should be a last resort, used only once all proper, white-hat avenues have been exhausted: first, contact the sites where the backlinks originate and ask for them to be removed.
In order to use the disavow tool, you need to create and submit a disavow file. This file lists the backlinks you don’t want Google to take into consideration when crawling your site, either as domain-level entries (recommended) or as individual links. In turn, this could help limit any drop in traffic to your web pages.
Be very careful not to accidentally include high-quality backlinks in the disavow file, as this could directly harm your rankings. Furthermore, Google replaces any previously submitted file with the new one rather than adding to it, so remember to include ALL the low-quality links, not just the new ones.
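For illustration, a disavow file is a plain .txt file with one entry per line: `domain:` entries disavow every link from that domain, while a bare URL disavows a single page. Lines starting with `#` are comments. The domains below are hypothetical examples:

```text
# Removal requested from these sites with no response,
# so disavow the whole domain (recommended)
domain:spammy-link-farm.example
domain:paid-links.example

# Disavow links from a single page
http://low-quality-blog.example/cheap-links.html
```

The file is then uploaded for the relevant property via the disavow tool in Google Search Console.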
The results of a disavow file will not appear immediately, and it is not clear which links have been discounted and which have not. Some websites submit a disavow file and see no change at all; this usually means Google was already discounting those low-quality links anyway. Since the Penguin update, Google can recognise and ignore most naturally occurring low-quality backlinks on its own. As such, the value of the disavow tool has been questioned in recent years: some SEOs still use it, while others say it is no longer necessary.
If you’ve been using manipulative link practices and want to redeem your site, it could be worth disavowing all of these links. If you simply have low-quality backlinks through no fault of your own, you can probably leave them alone unless you are seriously concerned. An SEO agency can advise on this.
To summarise, here’s what you need to do to avoid the negative effects of the Penguin algorithm:
- Locate poor-quality, harmful backlinks and request that they are removed
- Outreach to high-quality sites and win backlinks
- Submit disavows and keep track of these
Google Pigeon Update
Launched: 24th July 2014
Goal: The Pigeon update rewarded local businesses on Google Maps, giving them a chance to compete with well-known brands in their locations.
How it works: Google updated several ranking factors to improve local search, including its distance and location parameters, so that users are met with accurate, proximity-based results for their location.
Local search results
The most widely acknowledged change from the series of updates that followed the initial Pigeon launch was the ‘local pack’ in the search results. In 2015, Google reduced this pack from seven businesses to the three most relevant, shown underneath the search bar.
Businesses with a good online presence already, and those well-known brands, benefitted from this update. However, those smaller businesses with only a one-page website and contact form may have found it harder to compete in these results.
To give all businesses, big or small, a chance to compete in local search, Google launched Google My Business, which replaced Google Places and Google+ Local. With this free tool, business owners can manage and edit the information shown in search results, including their NAP details, opening hours, reviews and more.
With a well-optimised Google My Business listing containing correct business details, everyone has a chance of ranking in local searches. There are, however, several rules businesses must follow in order to benefit from the platform. It’s safe to say that the Pigeon update has changed local SEO for the better.
To summarise, here’s what you should do to Pigeon-proof your site:
- Create localised content, e.g. location pages for the services you provide, to maximise your chances of being associated with that area. Make sure to include your NAP (name, address, phone number) details at the top of each major location page.
- Focus on earning local citations in online directories, with accurate and consistent details.
- Manage any possible duplicate business listings.
- Encourage positive reviews from customers.
- Manage the information on Google My Business and take advantage of the free tools it provides.
Google Hummingbird Update
Launched: 20th August 2013
Goal: To understand the intent of search queries beyond keywords and provide users with increasingly relevant information by using semantic search and the knowledge graph.
How it works: Google uses semantic search to present results based on the user’s intent rather than the user’s language.
The Hummingbird update overhauled Google’s core algorithm, unlike the previous Penguin and Panda updates which acted as add-ons.
Google developed semantic search, which aims to uncover the deeper intent behind search queries rather than taking them at face value. Prior to the Hummingbird update, Google had created the Knowledge Graph, which presents searchers with instant answers to their queries in the form of lists, rich informational text, and images.
By using the Knowledge Graph and semantic search, Google transformed the way we search. SEOs no longer had to produce content written for robots to crawl; content could now be written naturally, for human readers.
Hummingbird improved the relevancy of search results by offering users web pages that accurately answered their questions. A simple example: a search for ‘best Indian places’ would show Indian takeaways close to the searcher instead of a pile of articles describing the best places to live in India. The results would differ drastically, however, if you were searching from Delhi.
To summarise, here is how to optimise your website to benefit from the Hummingbird update:
- Take advantage of voice search. The use of voice assistants such as Alexa and Google Home has increased in recent years, so it is important to take this conversational way of searching into consideration.
- Think synonyms. Google surfaces results that match variations of the keywords searchers use, so it is beneficial to include a range of synonyms in your keyword research.
Google Mobile-Friendly Update
Launched: 21st April 2015
Goal: Google aimed to improve the overall user experience by rewarding mobile-friendly sites.
How it works: Google checks pages against several mobile-friendliness requirements, such as content being proportionate to smartphone screen sizes, and rewards pages that meet them.
To further improve the user experience of search, Google had to reflect the cultural shift towards smartphone use.
The impact this had on websites that weren’t mobile-friendly was significant. Google now takes a mobile-first approach to indexing, and the mobile-friendly signal applies to individual pages rather than websites as a whole.
In 2016, Google announced that they would provide even more reward to those who complied with the mobile-friendly optimisations. Since then, this algorithm has not only changed the way SERPs appear but also forever revolutionised how the website design industry works.
To summarise, in order to have a chance of ranking highly on Google, your website has to be mobile-friendly. Here’s how:
- Improve the loading speed of the site
- Reduce the number of ads and pop-ups
- Better the user experience by making the website responsive and easy to navigate
- Simplicity wins. Ditch the outdated technology and focus on providing users with a straightforward design
- Create CTA buttons that are large enough to be used on a mobile device
- Ensure font size is readable
- Include the meta viewport tag to manage the width of the page on whatever device it is accessed by:
- Add this to the HTML of each page: <meta name="viewport" content="width=device-width, initial-scale=1">
Google Page Experience Update
Launched: Mid-June 2021
Goal: To improve the loading speed of pages and therefore improve page experience through new ranking signals referred to as Core Web Vitals.
How it works: The Core Web Vitals are first input delay (FID), cumulative layout shift (CLS) and largest contentful paint (LCP).
Find out more about the page experience update and how it could affect your website on our SEO Hub.
If you are interested in keeping up-to-date with algorithm updates, several great websites offer in-depth information as they roll out, including Search Engine Journal and Moz.
If you’re unsure you can truly grasp the complex world (wide web) of algorithms, or you want to speak to an agency that knows what they’re on about, upUgo are here to help!