Fred – Google’s Algorithm – The Next Search Engine Update
There has been a lot of speculation, discussion and stress within the SEO (Search Engine Optimisation) community in the last few weeks. What began as an unconfirmed Google algorithm update was, after a few weeks of industry pressure, acknowledged by Google's Gary Illyes during an AMA (Ask Me Anything) with Google session: any sites not following the basic webmaster principles for quality websites (and ultimately content) would have been affected by the Fred update. Google's John Mueller confirmed the update shortly thereafter, and the industry has been tracking its impact since.
Now keep in mind that, apart from the webmaster guidelines and various other help articles, Google is well known for avoiding confirmation of algorithm updates (e.g. the apparent content and link quality updates in February). Thanks to the hard work of SEO experts like Barry Schwartz, certain questions were answered and information around this update has become clearer – site architecture and user engagement barriers are attracting heavy penalties. Schwartz picked up a further tweak to the algorithm earlier this month, which points to Google fine-tuning this update even further – something that all brands should be taking note of.
A Consistent Push for Quality
Remember that this update, coupled with the unconfirmed link quality update in February, follows on from the updates towards the end of last year that we covered – further steps to refine the Google algorithm, which handles the majority of global searches, to weed out black-hat techniques and counter the massive marketing mechanism that has sprung up around the abuse and monetisation of content. Users today are bombarded with irrelevant and far-reaching content that diminishes the quality of the brand's user experience.
A Focus on User Experience
So, at the expense of users, much of the industry took a bad turn with techniques such as link spamming, content spamming and ad-heavy sites – and those who relied heavily on these techniques are being hit hardest. With the latest Fred update pointing back to Google's core webmaster guidelines, brands are having to rethink their approach, seeking quality SEO and content services in line with these practices.
What does this all mean?
Brands are going to have to take the time to consider their digital presence now more than ever. Legacy sites or platforms are particularly at risk: those with problematic page URL structures, those that result in thin or content-devoid pages, and those that are not built correctly or do not function quickly and in a user-friendly manner for the majority of internet traffic (i.e. smartphones and tablets).
Expensive link building and content creation / content monetisation strategies need to be reconsidered, and agencies providing these services need to weigh the long-term benefits of providing more micro-level attention instead of focusing at a macro level with a high bleed rate.
Readability: focus on clear, user-friendly writing. Avoid technical jargon, provide insight and bespoke understanding, and show that you are authoritative and skilled within your field.
Timeliness: content should be fresh, ongoing and never allowed to dwindle. Relevant, seasonal content, or content tied to trending topics, is ideal.
Content that is relevant, fresh and easy to read forms the basis of content quality, and is where any content quality audit should focus. Search listings pointing to pages with low content quality will already be suffering, and this should be visible in your reporting over the last month or so.
This covers the user's experience and how easy it is to interact with your website. Site speed, content layout, and your site's bounce rate and time on site are all factored in. Low click-through rates for certain keywords linking to certain pages are the first indication that you need to focus on those pages as a matter of urgency.
Again, drops in click-through rates should be clearly visible now that these updates have been running for a month or so. In many cases the biggest causes of penalties can be quickly identified and corrected; however, for certain legacy systems, or websites with many underlying issues, a more holistic approach to the website's design, architecture and content strategy is recommended.
The overall aim of the Google algorithm is, and will always be, great, relevant content. The best way to appreciate this in the context of your current digital environment is to put yourself in the position of your customers – perhaps through your own experience of engaging with a brand recently:
- Was the site lightning fast?
- Was it easy to use on all devices?
- Did you get a consistent brand experience (i.e. one website) across desktop, tablet and smartphone?
- Did you get the information/pricing/quote you were looking for, or did you go to another search listing that provided what you wanted?
- Was the content clear, concise and well written?
- Was the content easy to consume (i.e. within 1 – 3 screen views/scrolls) and were content heavy pages managed with tabs or accordions for easy consumption?
- Is the site active on social channels, what online reviews do they have (if any)?
Thus, from a customer's point of view, it is clear that the vast majority of brands could look inwards and agree that many of the above questions would identify improvements to make.
To meet the ever-changing Google algorithm (and its continuing focus on web content quality), the fundamental asset at your disposal is your website, and all other social, review and marketing activities need to promote and support it as the core of your SEO strategy.
With the vast majority of traffic being routed through Google, your brand's market opportunity is increasingly driven by the quality and relevancy requirements of the Google algorithm.