Firstly, you need to put a meeting together with your stakeholders, discuss what you’d like to test, and put a plan in place for those tests based on opinion and previous experience…
Sorry to pull out the GIF but this really needs to sink in!
A/B testing should be based on data from your customers. Data such as:
– customer feedback (qualitative)
– cold hard stats (quantitative)
For a client I worked with, the CEO had his own list of A/B tests he wanted to run. He was a lovely chap. He thought he knew what would work based on his massive amount of experience. But unfortunately, it didn’t work. He isn’t the target audience and was unable to put himself in the customers’ shoes. I ran his tests and they all flopped – everyone wondered why. It was gutting – they were really great tests which should have worked.
We all wondered: was there a problem with the A/B testing software? Was it the time of month? Was the audience right?
We hadn’t based the test on what our customers were saying.
The test wasn’t talking to the customer in a way that resonated with them.
Once the CEO saw that the tests didn’t work, he agreed to let me do customer research – awesome!
I looked around for ways to gather customer research and found Hotjar. Hotjar allows you to add a poll to any page of a website to ask customers questions.
We asked customers whether anything was stopping them from continuing their journey of applying for this particular product. The answers we received were eye-opening and SO useful.
What issues does the customer have?
The customers were looking to apply for a loan. They were concerned and wanted to know if it was a soft search – basically, would applying affect their credit score? It wouldn’t, so we needed to make this clear. This was instantly something we could address in the copy – we could reassure the customer.
Another example I can share: we were saying ‘poor credit’ but the customers were saying ‘bad credit’ – this was instantly another idea to put on the list.
We A/B tested the two ideas above and increased the conversion rate of customers applying by 17%.
17%! That’s huge!
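A relative lift like that only counts if the test is statistically significant, not just noise. As a minimal sketch – using hypothetical visitor and conversion numbers, since the article doesn’t share the raw figures – here is how a standard two-proportion z-test checks whether a 17% relative lift is real:

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test for an A/B conversion comparison."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via math.erf)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    relative_lift = (p_b - p_a) / p_a
    return z, p_value, relative_lift

# Hypothetical numbers: 5,000 visitors per variant, an 8% baseline
# conversion rate, and a variant converting at 9.36% – a 17% relative lift.
z, p, lift = two_proportion_z_test(400, 5000, 468, 5000)
print(f"z = {z:.2f}, p-value = {p:.3f}, relative lift = {lift:.0%}")
```

With a sample that size, the p-value comes in below 0.05, so the lift would clear the usual significance bar; with far fewer visitors, the same 17% lift could easily be chance.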
Not only was I keeping management happy but, most importantly, the tests were making the pages clearer and more transparent for the customer, helping them decide whether or not to apply.
This just goes to show that no matter how good your ideas are, if you come up with a list of A/B tests yourself (or with stakeholders), the odds are against you. You need to have empathy for your customer.
- How are they feeling at that moment?
- Are they worried, do they have questions you could potentially answer in the copy?
- Are you being clear enough?
- Are you being TOO clear and including too much copy which is turning them off / confusing them?
You can get all of these questions answered with customer feedback. Customers are all around you! Ask your family and friends for their opinions. Walk into a coffee shop and ask people for their opinion in return for buying them a coffee.
No matter how small, people will always have an opinion. This is your chance to uncover those opinions to make your website better! It’s not just about making revenue or having a test hit significance. It’s about improving the journey for the customer.
Questions to always ask yourself – are you putting the customer first? Is this the clearest it can be for the customer?