How To Optimise Conversions Using A/B Tests
When people talk about A/B tests they often refer to call to action button changes and landing page tests. They also sometimes talk about only changing one element on a page at a time to ensure you can tell exactly what generated the difference between the two experiences. This last point of view can be quite misleading and could hold back your optimisation programme.
Your A/B tests should be based on a systematic, best-practice process of discovery, evidence and prioritisation. Once you have that in place you also need to consider how to build a test plan for each of your key pages or journeys. Begin by prioritising where to test, and then consider what kind of tests you should run.
This brings us to the question of which types of A/B test you should include in your testing roadmap. I’ve outlined below six testing approaches to consider, and you should employ all of them to optimise your site and improve conversions.
1. Innovation A/B Tests:
Unless you happen to work for Google or some other mega website, you have to change more than one element at a time if you are to make quick progress in your optimisation journey. Innovation or redirect A/B tests allow you to experiment with something completely different and give you the opportunity to ensure the new page is better aligned with your business goals. The idea is that you can leave all the baggage of the existing page behind and design a radical new experience that allows you to leapfrog to a much higher conversion rate.
Find an important web page, one with lots of traffic and a conversion rate that you believe can be significantly improved upon. You can then use a heuristic evaluation of the page to identify areas for improvement and use the other stages of the optimisation process to gather further insights to help you construct your new innovative design.
As the design is radically different from your existing page, you may want to manage the risk of the test. You could begin by sending only a relatively low proportion of traffic to the new variant. Once the test has been running for a few days and you haven’t seen a big drop in conversion rate, you can increase the proportion of traffic to the variant to reduce the time the test takes to complete. However, be careful with this approach: because the two arms end up with different mixes of visitors from the early and late periods, it can produce Simpson’s Paradox, where the pooled conversion rate hides, or even reverses, the differences seen within each period.
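To see why, here is a minimal Python sketch with entirely made-up numbers: the variant wins within each phase of the test, yet because it received most of its traffic during the lower-converting phase, it loses badly on the pooled figures.

```python
# Hypothetical numbers illustrating Simpson's Paradox during a traffic ramp.
# Phase 1: a high-converting period where variant B only receives 10% of traffic.
# Phase 2: a lower-converting period after the split is ramped to 50/50.
# Each entry is (visitors, conversions).
phases = {
    "phase 1 (10% to B)": {"A": (9000, 900), "B": (1000, 110)},
    "phase 2 (50% to B)": {"A": (5000, 100), "B": (5000, 110)},
}

totals = {"A": [0, 0], "B": [0, 0]}
for name, arms in phases.items():
    for arm, (visitors, conversions) in arms.items():
        totals[arm][0] += visitors
        totals[arm][1] += conversions
    print(name, {arm: f"{c / v:.1%}" for arm, (v, c) in arms.items()})

print("pooled", {arm: f"{c / v:.1%}" for arm, (v, c) in totals.items()})
# B wins in both phases (11.0% vs 10.0%, and 2.2% vs 2.0%),
# yet the pooled figures show A at 7.1% and B at only 3.7%.
```

The ramp is there for risk control, not measurement, so if you change the split mid-test, either analyse each allocation period separately or restart the measurement window once the split is stable.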
2. Optimise and Multivariate Tests:
Once you have found a new innovative design that performs better than your existing page, you should dissect it to understand how you can further enhance its effectiveness. Provided you have sufficient traffic, multivariate testing (MVT) can be used. Unlike A/B tests, MVT allows you to change content within multiple sections of the same page and compare all the possible combinations against each other. For example, if you wanted to test three sections on a page, with two variants for each section, that would generate 8 combinations.
2 x 2 x 2 = 8
However, adding just one more variant to a single section increases the test combinations from 8 to 12.
2 x 2 x 3 = 12
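If it helps to see the arithmetic, here is a tiny Python sketch (the section names are purely illustrative) that counts the combinations as the product of the number of variants in each section:

```python
from math import prod

# Number of variants (including the control) in each section under test.
sections = {"headline": 2, "hero image": 2, "call to action": 2}
print(prod(sections.values()))   # 2 x 2 x 2 = 8 combinations

sections["call to action"] = 3   # add one more variant to a single section
print(prod(sections.values()))   # 2 x 2 x 3 = 12 combinations
```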
MVTs have the advantage that they allow you to isolate many small page elements and understand their individual impact on conversion. You can also evaluate interaction effects between multiple independent elements to find compound effects. This can save you time, as you don’t have to create and test many different variations of a page element that might not have much impact on your conversion rate anyway.
On the downside, MVTs require more traffic than A/B tests to achieve statistical confidence. If you don’t have the traffic to support a complex MVT, limit the number of combinations or run a series of A/B tests instead. With MVTs you also need to ensure that all the variations within each section make sense together. Once the MVT has identified which page elements contribute most to conversion, validate the winning combination with an A/B test to check that it delivers the promised uplift.
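To get a feel for how quickly the traffic requirement grows, here is a rough Python sketch using a standard two-proportion sample-size approximation (the 3% baseline conversion rate, 10% relative uplift, 95% confidence and 80% power are illustrative assumptions, and your testing tool’s exact formula may differ):

```python
from statistics import NormalDist

def visitors_per_arm(baseline, relative_uplift, alpha=0.05, power=0.8):
    """Approximate visitors needed per arm of a two-proportion test."""
    p1 = baseline
    p2 = baseline * (1 + relative_uplift)
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)   # ~1.96 for a 95% confidence level
    z_beta = z.inv_cdf(power)            # ~0.84 for 80% power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return int((z_alpha + z_beta) ** 2 * variance / (p2 - p1) ** 2)

n = visitors_per_arm(baseline=0.03, relative_uplift=0.10)
print(f"per combination:         {n:>9,}")      # roughly 53,000 visitors
print(f"A/B test (2 arms):       {2 * n:>9,}")  # roughly 106,000 visitors
print(f"MVT with 8 combinations: {8 * n:>9,}")  # roughly 426,000 visitors
```

Under these assumptions an eight-combination MVT needs roughly four times the traffic of a simple A/B test, which is why MVT is usually only practical on high-traffic pages.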

3. Real Estate A/B Tests:
Although you may now have a high-performing page, how do you know that all the elements on the page are in the best location? Some elements could be underperforming from a conversion perspective simply because they are in a sub-optimal location. Perhaps your main call to action is too far down the page, or testimonials are taking up prime real estate above the fold when they would be just as effective further down the page.

Never assume that elements are in the best locations. Your visual analytics tools, such as click and mouse-movement heatmaps, should provide evidence of elements that are not getting the attention you might expect. You can then work with your web designers and developers to create A/B tests that challenge an element’s existing location on the page. Try moving elements to different locations, but make sure the page flow still works, as a broken flow could itself influence the test result.
4. Inclusion/Exclusion A/B Tests:
Is that auto-rotating carousel really improving conversion? This is the stage in your page optimisation process where you start turning off elements on your page to identify what actually influences conversion. If you remove the carousel from your homepage and see a positive impact on conversion, this tells you that you either have a poorly designed carousel or that you could use that prime real estate for other conversion-influencing assets.
These types of A/B tests are ideal for pages like your home page that have many different elements on them and could benefit from being de-cluttered. Unnecessary assets on a page can be distracting and reduce engagement at an important stage in the user journey. If removing an element has no impact on conversion, it too is a candidate for removal, or it could be moved to a less important page or location.
If removing an asset has a negative impact on conversion, you know to retain it, as showing it clearly improves conversion. However, you should then run follow-up A/B tests on this element to determine the best design for this type of asset.
Be cautious about removing an asset whose removal shows a positive impact on conversion if it relates to specific use cases or conversion goals. Perhaps the element has been poorly designed or is difficult to understand. If you have any evidence that this might be the case, try some A/B tests with different designs before deciding to remove it from the page.
5. Segment and Target Your A/B Tests:

If you treat all your visitors the same, you can only expect an average conversion rate. Inevitably, some of your A/B test variations will better meet the needs of certain visitor segments, and may therefore convert significantly better for one group but less well for others. To further improve your conversion rate, evaluate how you can segment and target your A/B tests to create experiences designed to better satisfy the needs of individual customer groups.
This approach will also boost your conversion rate because it leads to a much more dynamic website that responds to the needs of different user segments. Set up the key visitor segments (e.g. new and returning customers) that you want to analyse and target with different content in your analytics. This allows you to analyse your test results and identify customer segments where the uplift was significantly better than the average. You can then serve your winning test experience to the visitor types that are most responsive to your new content.
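As a minimal sketch of that analysis step (the segment names, numbers and the simple two-proportion z-test are all illustrative assumptions rather than what your testing tool actually reports), you could break a test result down by segment like this:

```python
from statistics import NormalDist

# Hypothetical test results broken down by visitor segment:
# segment -> variant -> (visitors, conversions)
results = {
    "new visitors":       {"control": (12000, 360), "variant": (12000, 420)},
    "returning visitors": {"control": (8000, 400),  "variant": (8000, 404)},
}

def z_test(a, b):
    """Two-proportion z-test; returns the two-sided p-value."""
    (n1, c1), (n2, c2) = a, b
    pooled = (c1 + c2) / (n1 + n2)
    se = (pooled * (1 - pooled) * (1 / n1 + 1 / n2)) ** 0.5
    z = (c2 / n2 - c1 / n1) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

for segment, arms in results.items():
    control, variant = arms["control"], arms["variant"]
    uplift = (variant[1] / variant[0]) / (control[1] / control[0]) - 1
    print(f"{segment}: uplift {uplift:+.1%}, p = {z_test(control, variant):.3f}")
```

Watch the sample size within each segment: slicing a test that was powered for the whole population into sub-groups increases the risk of chance “winners”, so treat segment-level results as hypotheses to confirm with a targeted follow-up test.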
Content automation is increasingly encroaching on this space, and although it is a great tool, it is not a silver bullet. You can only automate the content you have, and if that content is not optimal and engaging, automating it will be of limited or no value. Use A/B testing to identify relevant and engaging content and to understand how individual visitor segments respond to different user experiences; this will improve your chances of producing content that benefits from automation and responds to customer needs.
6. Test Iteration:
To avoid random, ad-hoc testing, always base your tests on insights gleaned from previous tests, or test additional assets following on from an initial test. Testing is a continuous process that enables your website to evolve gradually to better satisfy your customers’ needs and to provide new insights that enhance your content marketing. A test-and-learn process is a much more scientific approach to website improvement than redesigning your website from scratch.

Conclusion:
By using these strategies to create a systematic plan for optimising key pages on your site you are more likely to deliver substantial and sustainable uplifts in conversion. Each type of test is designed to provide specific insights and allow you to further enhance your conversion rate.
Never assume you have come to the end of your journey: your competitors will respond to your optimisation strategy, and disruptive technologies may change customer behaviour. You will need to continue the optimisation process if you want to keep pace with changing visitor needs.