Ecommerce teams, you know you’ve got it tough. With limited resources and a barge load of data, you’re expected to maximise on-site conversions and keep your site’s UX tip-top, using nothing but your wits, experience and a few software tools.
But you know this already. The reason you bother reading blogs like this is, shock horror, that you actually want tangible solutions to the challenges you face.
Sit tight, we’ve got your back. In this three-part blog series, we’ll cover how your team can navigate some of the most pressing issues holding your ecommerce business back.
Challenge 1: Adopting the right approach to A/B testing
"STOP A/B TESTING, YOU'RE WASTING YOUR TIME!"
Matt Henton, Head of Ecommerce, Moss Bros
At our Future:Retail rebel conference a few months ago, washing machine enthusiast Matt Henton provoked a partisan ecommerce crowd with this quote during his presentation. It was, of course, largely tongue-in-cheek, but there was a point to be made. Matt’s implication: most teams waste a hell of a lot of time testing absolutely everything. Instead, brands should better prioritise the tests they’re running, or simply implement the changes they’re very confident will win.
Matt implored teams to just fix “the broken shit” on their sites. A quick win is to check your 404 logs for recurring issues. If a particular 404 URL is getting visited multiple times, fix it before it becomes a problem and hurts your revenue and long-term user experience. Matt also extolled the need to really understand what users are actually doing (where they’re getting frustrated and clicking repeatedly on a particular ‘chunk’ of content, for example).
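To make that quick win concrete, here’s a minimal sketch of scanning an access log for recurring 404s. The log lines and URLs below are hypothetical, and the parsing assumes the common log format — adjust it to whatever your server actually writes.

```python
from collections import Counter

# Hypothetical access-log lines in common log format (your server's
# format may differ -- adapt the parsing below accordingly).
log_lines = [
    '10.0.0.1 - - [22/Aug/2019:10:01:02 +0000] "GET /sale/old-promo HTTP/1.1" 404 512',
    '10.0.0.2 - - [22/Aug/2019:10:02:10 +0000] "GET /products/boots HTTP/1.1" 200 2048',
    '10.0.0.3 - - [22/Aug/2019:10:03:45 +0000] "GET /sale/old-promo HTTP/1.1" 404 512',
    '10.0.0.4 - - [22/Aug/2019:10:05:00 +0000] "GET /sale/old-promo HTTP/1.1" 404 512',
]

def recurring_404s(lines, min_hits=2):
    """Count 404 responses per URL and keep those seen min_hits+ times."""
    hits = Counter()
    for line in lines:
        parts = line.split('"')
        if len(parts) < 3:
            continue  # skip lines that don't match the expected shape
        request, rest = parts[1], parts[2].split()
        if rest and rest[0] == "404":
            url = request.split()[1]  # "GET /path HTTP/1.1" -> "/path"
            hits[url] += 1
    return {url: n for url, n in hits.items() if n >= min_hits}

print(recurring_404s(log_lines))  # -> {'/sale/old-promo': 3}
```

Anything that surfaces here repeatedly is a candidate for a redirect or a fix before it starts costing you revenue.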
What’s holding many ecommerce teams back is that they struggle to quickly answer questions like:
- How much money did my homepage hero banner generate this week and is that more or less than last week?
- Does my burger menu or my search bar drive higher conversion on my mobile site?
- Why is the basket-to-checkout drop-off so big?
- Why are my returning visitors struggling to fill in a particular form?
Businesses facing these questions often rely on traditional analytics for answers. But traditional analytics can only tell you what customers are doing on your site, not why or how they’re doing it. Some have turned to session replay tools to understand behaviour, but the same frustrations arise. Brands should look for tools that display aggregated user journeys visually, enabling them to understand why customers are leaving their site as well as measure the revenue and behavioural contribution of any ‘block’ of content. There’s a huge need to understand your golden (or broken) customer journeys, feed actionable insights into test hypotheses, and recognise why tests are winning or inconclusive.
Mud wrestling with HiPPOs
In every organisation a powerful and dangerous animal lurks. The HiPPO.
The HiPPO (highest paid person’s opinion) effect can have a detrimental impact on testing ideas, and it raises a common challenge for ecommerce teams: do you test based on data, or on opinion?
How many of you typically test a hypothesis for one or both of the reasons below?
“Because I had an idea and I wanted to see if it worked...”
“Because my manager told me to...”
More often than not, testing roadmaps are centred on the above rather than on data that drives, backs and explains the tests. Your testing ideas could come from many places: your trade team, your marketing team, your CEO, or worse, the “that’s just how we’ve always done it” mantra. Digital teams need to come to a single source of truth.
Data v opinion
Testing is tough, and people are used to doing things a certain way, so it’s hard to convince teams to change their approach. But as a digital team, the reality is that your job is to state facts, not opinions or biased hypotheses. Every brand will have preconceptions about their customers and what they want. Ecommerce teams across all industries need to develop a culture of replacing preconceptions with data.
Teams should trust the data more, but user behaviour is nuanced, and numbers alone won’t uncover every usability issue. Blindly shipping whatever tests well can lead to a cycle of chasing quick-fix features and a disjointed product. Two questions arise:
How do you measure test success?
Is conversion rate always the appropriate metric?
More than Conversion Rate Optimisation?
Success is more than just CRO: you want users to return and purchase on your site time and time again. The right approach balances CRO with solving users’ problems, which creates long-term value. In an ideal world, you’d run quantitative and qualitative testing at the same time, but how many brands have the time and resources to do this?
Instead, set metric goals before launching a test and report on these alongside the traditional ‘win/loss’ view. And make sure you don’t get addicted to small wins: they’re important, and the cumulative effect can be significant, but you don’t want them to become a barrier to the innovative product work that can delight your customers.
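One way to make the ‘win/loss’ call less of a judgment call is a standard two-proportion z-test on conversion rates. This sketch uses only the Python standard library; the visitor and conversion counts are made-up numbers for illustration.

```python
from math import sqrt, erf

def ab_test_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: returns (absolute lift, two-sided p-value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_b - p_a, p_value

# Hypothetical test: 300/10,000 conversions (control) vs 360/10,000 (variant).
lift, p = ab_test_z(300, 10_000, 360, 10_000)
print(f"lift={lift:.4f}, p={p:.3f}")
```

A low p-value (conventionally below 0.05) says the lift is unlikely to be noise — but, as above, it says nothing about whether conversion rate was the right metric to move in the first place.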
Louise Vallender (Head of Ecommerce) and Steve Thomson (Head of User Experience) at Dune Group believe that, all too often, the classic mistakes retail ecommerce teams make can be avoided. So much so that they’ve partnered with ContentSquare to show how brands across every industry can resolve these challenges quickly, covering:
- Combating common barriers when creating your optimisation strategies
- How to accelerate your speed to insight and transform your testing culture
- How to influence your organisational culture to become data-driven
If you want to hear answers (or at least potential solutions) for the above, join us on August 22nd at 4pm GMT.