When I build a UX audit for a client, I look at these things:
- The “what” – what is the user doing, where do they go, where have they come from? For this I look at Google Analytics (GA) & heatmaps
- The “why” – why is the user doing that? For this I look at heatmaps and live-chat archives, plus my own opinion based on experience, benchmarks & standards
- I also look at speed tests using WebPageTest, GTmetrix, PageSpeed Insights and Test My Site by Google
The client has asked for an audit because they need your expertise. They’re probably so close to it they can’t see the wood for the trees anymore – they need a neutral, professional opinion, based on facts.
Where do you start?
Start with the main goal – the most profitable journey. What’s going on? Examine the funnel to see where users are dropping off.
How are users accessing the website? Is it from social or PPC? What does the data look like for each audience (bounce rate, exit rate, time on page)? Find the specific ads and banners the user is seeing so you get a full, holistic view of the whole journey. Don’t just start on the website – that’s not where the user’s journey started, so it shouldn’t be where the research starts.
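A quick way to spot where the funnel is leaking is to compute the percentage of sessions lost at each step. The sketch below assumes hypothetical step names and session counts (all invented for illustration):

```python
# Hypothetical checkout funnel: (step name, sessions reaching that step)
funnel = [
    ("Product page", 10000),
    ("Basket", 4200),
    ("Delivery details", 2100),
    ("Payment", 1900),
    ("Order confirmed", 1500),
]

def drop_off_rates(steps):
    """Return (from_step, to_step, % of sessions lost) for each transition."""
    rates = []
    for (name_a, count_a), (name_b, count_b) in zip(steps, steps[1:]):
        lost = (count_a - count_b) / count_a * 100
        rates.append((name_a, name_b, round(lost, 1)))
    return rates

for a, b, pct in drop_off_rates(funnel):
    print(f"{a} -> {b}: {pct}% of sessions lost")
```

With these made-up numbers, the product page to basket transition loses the most sessions, so that is where the audit would dig in first.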
Of course, some of the data will be “vanity metrics” – ‘time on page’, for example, where it’s almost impossible to tell whether a short or long time signals a good or poor experience. Similarly, a high bounce rate might look like customers aren’t engaging with the page (and therefore having a bad experience), but on a blog page serving a specific need, the user might read the full article without clicking anything – getting what they wanted and leaving satisfied, with no desire to look around further.
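Because the same bounce rate can mean different things on different page types, it helps to segment before judging. A minimal sketch, using invented session records (the page types and bounce flags are hypothetical):

```python
# Invented session records: which page type the session landed on,
# and whether it bounced.
sessions = [
    {"page_type": "blog", "bounced": True},
    {"page_type": "blog", "bounced": True},
    {"page_type": "blog", "bounced": False},
    {"page_type": "product", "bounced": True},
    {"page_type": "product", "bounced": False},
    {"page_type": "product", "bounced": False},
]

def bounce_rate_by_type(rows):
    """Bounce rate (%) per landing-page type, rounded to one decimal."""
    totals, bounces = {}, {}
    for row in rows:
        t = row["page_type"]
        totals[t] = totals.get(t, 0) + 1
        bounces[t] = bounces.get(t, 0) + (1 if row["bounced"] else 0)
    return {t: round(bounces[t] / totals[t] * 100, 1) for t in totals}

print(bounce_rate_by_type(sessions))
```

A higher bounce rate on blog pages than on product pages is expected and not necessarily a problem; the same figure on a checkout page would be.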
Heatmaps show what users are clicking on and how much of the page they’re seeing. Use them with caution: in Hotjar, an element can appear to have had multiple clicks, but when you drill down into the data the picture is often less worrying than it first looked. Also be careful with sampling – make sure the tool you’re using doesn’t collect all its results at the same time (that isn’t representative sampling!). If all the data was collected on a Saturday, it won’t give a general view of audience behaviour the way collecting across multiple days would. It’s also why A/B testing is great for understanding what works best: both variants run at the same time, rather than on different days when audiences’ moods and goals are totally different.
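To check whether an A/B test result is more than noise, a standard approach is a two-proportion z-test on the conversion counts. This is a sketch with invented numbers, using only the standard library (the counts and variant names are assumptions for illustration):

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference in conversion rates between variants."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical: variant A converted 120 of 2400 sessions, variant B 90 of 2400
z, p = two_proportion_z(120, 2400, 90, 2400)
print(f"z = {z:.2f}, p = {p:.3f}")
```

Because both variants run over the same days, differences in audience mood or intent affect both equally, which is exactly the advantage over comparing data collected on different days.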
Are the users mostly new or returning? If a high percentage are new, take extra care to include trust & credibility markers. New users aren’t likely to know the brand, so you need to demonstrate that the business is credible by adding elements such as reviews (from trusted companies, e.g. Trustpilot), awards won, accreditations etc.
How are users visiting the site – desktop, mobile? This will help figure out where to start your research.
If a website isn’t quick to load, you might as well say goodbye to a high percentage of users who don’t have the time or patience to wait while your blog images or forms load! Largest Contentful Paint (LCP) is one of the newer metrics used to assess page speed – it measures when the largest visible element has rendered, i.e. the point at which the page looks loaded to the user. The full load time might be longer, but the user doesn’t care about that, because the page looks ready to them!
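When reporting LCP in an audit, it helps to translate the raw seconds into the published Core Web Vitals buckets (good ≤ 2.5 s, needs improvement ≤ 4 s, poor above that). A minimal helper, where the function name is my own:

```python
def rate_lcp(seconds):
    """Bucket an LCP reading using the published Core Web Vitals thresholds."""
    if seconds <= 2.5:
        return "good"
    if seconds <= 4.0:
        return "needs improvement"
    return "poor"

# Hypothetical readings from a lab test
for lcp in (1.8, 3.2, 5.1):
    print(f"LCP {lcp}s -> {rate_lcp(lcp)}")
```

Presenting the bucket alongside the number gives the client an instant read on whether the figure is a problem.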
Google takes your website’s page speed into consideration, so ensuring it’s fast is imperative for good search rankings.
You need to be careful not to be too critical of your client’s website – you don’t want to offend. Make sure everything you mention is fact rather than opinion: instead of “using blue font on a red background is poor design”, include a colour-contrast result to steer the client in the right direction. Include positive points throughout the presentation too.
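That colour-contrast result can be computed rather than eyeballed, using the WCAG 2.x relative-luminance formula. The sketch below checks the blue-on-red example from above; pure blue on pure red comes out well under the WCAG AA minimum of 4.5:1 for body text, which is the kind of objective figure to put in the audit:

```python
def _channel(c):
    # sRGB channel (0-255) to its linearised value, per the WCAG 2.x formula
    c = c / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb):
    r, g, b = (_channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """WCAG contrast ratio between two RGB colours, from 1:1 up to 21:1."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Pure blue text on a pure red background
ratio = contrast_ratio((0, 0, 255), (255, 0, 0))
print(f"{ratio:.2f}:1")
```

“The contrast ratio here is about 2.1:1, below the 4.5:1 AA threshold” is a fact; “this colour scheme is ugly” is an opinion.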