What is conversion rate optimisation?

Conversion Rate Optimisation (CRO) is the process of optimising web pages and/or page elements to increase conversion rates. This normally involves running A/B tests (also called split tests), where two different versions of a page compete against each other. Traffic is divided equally between the two variants and, once statistical significance is reached, you can see which version achieves the higher conversion rate.
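
To make the mechanics concrete, here is a minimal Python sketch (illustrative only, not any particular testing tool) of an equal random split and a conversion tally. The variant names and functions are hypothetical; in practice your testing platform handles assignment and logging for you.

```python
import random

# Hypothetical in-memory tally; a real test would log to your analytics platform.
results = {
    "A": {"visitors": 0, "conversions": 0},
    "B": {"visitors": 0, "conversions": 0},
}

def assign_variant():
    """Assign a visitor to variant A or B with equal (50/50) probability."""
    return random.choice(["A", "B"])

def record_visit(variant, converted):
    """Record one visit and whether it ended in a conversion."""
    results[variant]["visitors"] += 1
    if converted:
        results[variant]["conversions"] += 1

def conversion_rate(variant):
    """Conversions divided by visitors for the given variant."""
    v = results[variant]
    return v["conversions"] / v["visitors"] if v["visitors"] else 0.0
```

In a live test you would also key the assignment to a user ID (for example by hashing it) so a returning visitor always sees the same variant.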

That last point about statistical significance is important and it relates to the biggest mistake brands make with conversion rate optimisation.

Conversion rate optimisation is a data-driven strategy

Conversion rate optimisation is a data-driven strategy, which means you need good data going into your tests and good data coming out of them.

Before you dive into testing, make sure you have the following in place:

  • In-depth conversion data: Conversion rates alone won’t help you pinpoint what needs testing. You need in-depth data on the actions users are (or aren’t) taking on your site. Use heatmaps, event measurement in Google Analytics and tools like form analytics to pinpoint the issues getting in the way of conversions.
  • Trends: With the right data coming in, you’ll start to see patterns that reveal opportunities for testing – for example, only 60% of users who start filling out your forms complete them successfully (a calculation like the one sketched after this list).
  • Hypotheses: For each trend, you need to come up with a hypothesis to explain what’s happening. Try not to guess; dig deeper into your data and aim to diagnose what’s causing the issue.
  • Test goals: Before you run your test, define what your goal is and pinpoint which KPI measures success – e.g. increase form completion rate to 90%+.
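
To give a feel for how a trend like the 60% form completion figure might be derived, here is a rough Python sketch using pandas. The event names and columns are assumptions for illustration, not the schema of any specific analytics export.

```python
import pandas as pd

# Hypothetical event export; "form_start" and "form_submit" are assumed event names.
events = pd.DataFrame({
    "user_id": [1, 1, 2, 3, 3, 4, 5],
    "event":   ["form_start", "form_submit", "form_start",
                "form_start", "form_submit", "form_start", "form_start"],
})

started   = set(events.loc[events["event"] == "form_start", "user_id"])
completed = set(events.loc[events["event"] == "form_submit", "user_id"])

completion_rate = len(completed & started) / len(started)
print(f"Form completion rate: {completion_rate:.0%}")  # 2 of 5 starters = 40% here
```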

Too many brands and marketers jump into conversion optimisation without the right data processes in place – and this sets them up for failure. Poor data delivers unreliable results and potential false positives or false negatives that could do more harm than good to your conversion rates.

Running your first A/B test

When it comes to running A/B tests, the biggest challenge is making sure you achieve results you can actually trust. This is where statistical significance comes into play: it describes how confident you can be that the difference between your variants is real rather than random noise. Ideally, aim for 97%+ statistical significance; anything under 95% starts to compromise your results.
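
If you want to sanity-check significance yourself, a simple two-proportion z-test is one common approach. The Python sketch below is a simplified illustration with made-up numbers; dedicated testing tools apply more rigorous methods (for example, corrections for checking results early).

```python
from math import sqrt
from scipy.stats import norm

def significance(conv_a, visitors_a, conv_b, visitors_b):
    """Two-sided two-proportion z-test, returned as a confidence level (1 - p-value)."""
    p_a = conv_a / visitors_a
    p_b = conv_b / visitors_b
    p_pool = (conv_a + conv_b) / (visitors_a + visitors_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - norm.cdf(abs(z)))
    return 1 - p_value

# Illustrative numbers only: 5.0% vs 6.5% conversion over 2,400 visitors each
print(f"{significance(120, 2400, 156, 2400):.1%}")  # look for 95%+ before calling a winner
```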

When the test ends and there is a clear winner between the two variants, break your conversion data down into specific segments such as traffic channel, device category and user demographics. This will help when you report on your conversion rate because it reveals which user types are converting at a high rate and which are not, giving you invaluable insight into your digital product’s performance and the areas that need improving.
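
As a rough sketch of that segmentation step, the pandas example below breaks conversion rate down by variant and device category. The column names are assumptions about what a test export might contain, not any specific tool’s schema.

```python
import pandas as pd

# Hypothetical per-visitor test export with assumed column names.
df = pd.DataFrame({
    "variant":   ["A", "A", "B", "B", "B", "A"],
    "device":    ["mobile", "desktop", "mobile", "desktop", "mobile", "mobile"],
    "channel":   ["organic", "paid", "organic", "organic", "paid", "paid"],
    "converted": [0, 1, 1, 1, 0, 0],
})

# Conversion rate per variant, split by device category
by_device = (df.groupby(["variant", "device"])["converted"]
               .agg(visitors="count", conv_rate="mean"))
print(by_device)

# Repeat for any other segment you care about, e.g. traffic channel
print(df.groupby(["variant", "channel"])["converted"].mean())
```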

Here are some things to keep in mind:

  • Only test one variable to begin with
  • Choose a large enough sample size (see the sample size sketch after this list)
  • Split traffic 50/50, at random
  • Run your test long enough to achieve statistical significance
  • Run your test long enough to smooth out external factors (random spikes in conversions, the holiday season, unusually hot summers, etc.)
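
To estimate how large “large enough” is, a standard power calculation for a two-proportion test gives a ballpark figure. The Python sketch below is an approximation; the baseline conversion rate and minimum detectable uplift are illustrative, so swap in your own numbers.

```python
from math import ceil
from scipy.stats import norm

def sample_size_per_variant(base_rate, relative_lift, alpha=0.05, power=0.8):
    """Approximate visitors needed per variant to detect a given relative uplift
    with a two-sided two-proportion z-test."""
    p1 = base_rate
    p2 = base_rate * (1 + relative_lift)
    z_alpha = norm.ppf(1 - alpha / 2)   # 1.96 for 95% significance
    z_beta = norm.ppf(power)            # 0.84 for 80% power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return ceil((z_alpha + z_beta) ** 2 * variance / (p2 - p1) ** 2)

# e.g. a 5% baseline conversion rate and a 20% relative uplift to detect
print(sample_size_per_variant(base_rate=0.05, relative_lift=0.20))  # roughly 8,000+ per variant
```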

Conversion rate optimisation is an ongoing strategy that turns data into better business results – as long as it’s done correctly. If you have any doubts about your CRO strategy or want to know more about getting started, call us on 02392 830281. Or you can learn more about our CRO services here.

Billy Farroll

Billy has worked in the digital industry for three years, specialising in User Experience. Having studied Digital Media Design at Bournemouth University and worked with computers from a young age, you could say a career in digital was always his destiny. Billy is most passionate about User Experience above all other areas because he sees true value in the specialism, something he conveys to everyone he works with.

He started as a Performance UX specialist, working on a wide variety of campaigns spanning design, testing and research, and now heads up the Performance UX team, overseeing every campaign in the service’s portfolio. His ambition is to grow the team into the biggest service Vertical Leap have, a target he sees as reachable by continuing to deliver results and increase conversions for clients. Away from understanding users’ needs, emotions and what makes them choose one button over another, Billy enjoys keeping fit, travelling and playing sports, especially boxing and football. He is from Southampton, still lives there and supports Southampton FC.
