(Updated February 2021)
Back in 2020, Google announced that they would be making User Experience (UX) more important in ranking search results and that they were introducing some new ranking factors to support this.
Google are generally quite secretive about the signals they use in their algorithms to calculate ranking positions, so this early announcement caused quite a stir in the search industry and underlined how important the change could be.
In November 2020, Google further announced that their aim was to introduce the new ranking factors in May 2021.
For now, there are three new Core Web Vitals (CWV) factors, which measure how users perceive the experience of interacting with a web page, and these combine with a number of other existing factors.
These three factors are Google's initial focus, but it is likely that Google will add to or change the metrics as they evaluate them over a longer period of time.
The CWV topic gets quite complicated and technical, so I will try not to go into too much detail here, but in “non-techy” terms, the three new factors are as follows.
- How long does it take before the largest piece of content (e.g. a banner image) is displayed when you visit a particular web page? (The techy term is “Largest Contentful Paint”, or LCP.)
- How long does it take, after you first interact with an element such as a button, before the browser can actually respond to that interaction? (The techy term is “First Input Delay”, or FID.)
- Do page elements like text and images shift their positions shortly after loading? (The techy term is “Cumulative Layout Shift”, or CLS.)
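To give a feel for how one of these metrics is actually calculated: Google's published formula scores each individual layout shift as the impact fraction (how much of the viewport was affected) multiplied by the distance fraction (how far the content moved), and CLS adds those scores up. A minimal sketch in Python, with function names of my own choosing:

```python
# Sketch of how a Cumulative Layout Shift (CLS) value is built up.
# Google's published formula for one shift is:
#   layout shift score = impact fraction * distance fraction
# where both fractions are expressed relative to the viewport.
# Function names here are my own, for illustration only.

def layout_shift_score(impact_fraction: float, distance_fraction: float) -> float:
    """Score for a single layout shift (both inputs range 0.0 to 1.0)."""
    return impact_fraction * distance_fraction

def cumulative_layout_shift(shifts: list[tuple[float, float]]) -> float:
    """Sum the individual shift scores to get a CLS value."""
    return sum(layout_shift_score(i, d) for i, d in shifts)

# Example: a banner covering 50% of the viewport moves by 14% of the
# viewport height, then an ad pushes 30% of the viewport down by 10%.
cls = cumulative_layout_shift([(0.5, 0.14), (0.3, 0.10)])
print(round(cls, 3))  # 0.1
```

Google treats a CLS of less than 0.1 as “good”, so this hypothetical page would sit right on the boundary.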
In a nutshell, users don’t like waiting for a web page to load, so the faster it loads, the better, and the more likely they are to stay on your website. Remember also that speed will typically be slower on a mobile device than on a desktop, and Google now uses the mobile version of a site for indexing.
There are a number of tools available to test website speed on different platforms and under different conditions. This is a complicated task, made more difficult by the fact that each tool works differently, and each time you run a test (even with the same tool) you can get different results, because many variable factors are involved (e.g. quality of hosting, server location, varying internet connection speeds, server load, etc.). The general approach is to use multiple tools with multiple runs and compare the results to get a feel for realistic values.
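The “multiple runs, compare results” step can be scripted. A small sketch (the measurement values are invented for illustration) showing why the median of several runs gives a more realistic picture than the average, which a single slow run can drag upwards:

```python
from statistics import mean, median

# Hypothetical Largest Contentful Paint times (in seconds) from five
# runs of the same page, perhaps across different testing tools.
# The numbers are made up for illustration.
lcp_runs = [2.1, 2.4, 2.2, 4.8, 2.3]  # one run hit a slow moment

# The mean is dragged up by the single slow run...
print(f"mean:   {mean(lcp_runs):.2f}s")    # mean:   2.76s
# ...while the median reflects the "typical" experience better.
print(f"median: {median(lcp_runs):.2f}s")  # median: 2.30s
```

The same approach extends to comparing before/after results when testing a fix: collect several runs of each and compare the medians rather than two single readings.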
Some of these tools suggest areas to look at to address the problems, which is very helpful, although not every suggestion can be acted on, since some issues are simply how the software (e.g. the theme or plugins) has to work. The quality of the code in the theme and plugins is very important, and badly written code can have a significant effect. Also, the more plugins you have installed on the site, the more complex it becomes, as each one adds more code.
Analysing a website to identify potential areas for improvement can be a lengthy process. Fixing issues is largely trial and error: test, identify items to address, apply a fix and test again to compare the results. Some fixes produce site-wide improvements, but others may not. Some make a significant difference, others very little, and occasionally a fix can actually make things worse. The best approach is to try the fixes with the biggest potential gains first and then assess whether it is worth investing more time and effort for further small enhancements.
The other point to remember is that each individual web page has to be analysed separately, so this can involve a lot of work.
Clearly, the new Core Web Vitals are going to have a significant effect on search rankings, so you need to look at how your website is performing and what you need to do to improve it. If you don’t, your current rankings could drop and your competitors could overtake you in the search results, particularly if they are optimising and you aren’t.
If you need some help, give us a shout!