Why Don’t Companies A/B Test Their Websites?


Does your website have lots of obvious faults and usability problems that you need to fix? Are you fully aware of what changes need to be made to improve the performance of your site? Do you lack some of the functionality that many of your competitors have on their websites? Have you employed a website design expert?

These are some of the many reasons I have come across for why companies don't conduct A/B testing on their websites. Despite the proven benefits of A/B and multivariate testing, many websites with sufficient traffic and conversions continue to make changes without running online experiments. Let's examine the logic behind not testing.

Design Faults:

Amazon.co.uk homepage

Your website has lots of design faults and usability issues? Show me a website that doesn't. There is no such thing as a website without usability and design issues. Even Amazon and Google have issues with their websites, but they use A/B tests to understand how changes influence visitor behaviour and their conversion rate.

This is the nature of any choice architecture. Compromises have to be made when designing any website. The main unknown factor here is how these decisions influence visitor behaviour and conversions.

So what I recommend is to conduct a user experience audit to fix the main problems, as this is certainly wise before you begin running experiments. If necessary, hire a conversion rate optimisation consultant to manage this process. But if you have sufficient traffic and conversions on your site, you should then consider A/B testing as part of a conversion rate optimisation programme.


Website Design Experts:

Business people like to hire ‘experts’ because we don’t always trust our own instincts and we are prone to authority bias. This means that we give more weight to advice from people in authority as we assume they are more accurate in their predictions. It also reduces the risk that we get blamed for a decision if we can point to it being based on expert advice.

Website design experts can review your site and make recommendations based upon experience and what has worked in the past. They can employ best practice principles and obtain visitor feedback.

However, given that every website has its own unique digital ecosystem, it is impossible to predict exactly how a change will impact user behaviour and your conversion goals. Measurement and testing should be employed to support and validate expert advice, as otherwise you may not get the improvements in revenues that you anticipate.

A/B testing allows the impact of proposed changes to be measured because it splits traffic randomly and employs a control (the existing webpage) to identify the difference in performance of the alternative user experience. This provides confidence that any statistically significant difference in conversion rate between the two experiences is unlikely to be the result of changes in other factors (e.g. the quality of traffic, competitor activity or the weather) or simply down to random variance.
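The significance check behind this is usually a two-proportion z-test: compare the control and variant conversion rates against the variation you would expect from chance alone. A minimal sketch in Python (the visitor and conversion counts below are hypothetical):

```python
import math

def ab_test_significance(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from control A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis of no real difference
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal distribution
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return p_a, p_b, z, p_value

# Hypothetical example: 10,000 visitors per arm,
# control converts at 3.0%, variant at 3.6%
p_a, p_b, z, p = ab_test_significance(300, 10_000, 360, 10_000)
print(f"control {p_a:.1%}, variant {p_b:.1%}, z = {z:.2f}, p = {p:.4f}")
```

A p-value below your chosen threshold (commonly 0.05) means the observed gap is unlikely to be random variance; testing tools wrap this same logic, often with extra corrections.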

This scientific approach is essential for digital marketers, as I have come across many instances where a design that is clearly inferior from a usability perspective delivers a higher conversion rate. Stakeholders may still want to consider improving the user experience, but unless you run a controlled test you won't be able to quantify the true cost or benefit of such a change.

Improved Functionality:


Unless it is relevant and implemented appropriately, improving the functionality of your website may not have the desired outcome. It may be a distraction, or it may change visitor behaviour in an unexpected way that damages conversion.

In many instances A/B testing allows you to trial new functionality without having to fully build and code it internally first. This means you can test and quantify the benefit against the cost of development before committing valuable and scarce resources to delivering something that may have limited or no impact on revenues.
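Under the hood, such experiments need a stable way to split visitors. One common approach is deterministic bucketing: hash each visitor ID together with an experiment name, so the same visitor always sees the same version without any stored state. A minimal sketch, with a hypothetical experiment name and a 50/50 split:

```python
import hashlib

def assign_variant(user_id, experiment="new-wishlist-button", split=0.5):
    """Deterministically bucket a visitor into control or variant.
    Hashing (experiment + user_id) gives a stable, effectively
    random assignment with no database lookups."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF  # uniform value in [0, 1]
    return "variant" if bucket < split else "control"

# The same visitor always lands in the same arm of this experiment
print(assign_variant("user-123"))
print(assign_variant("user-123"))
```

Including the experiment name in the hash means a visitor's bucket in one test is independent of their bucket in any other test running at the same time.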

Low Traffic:


Even if your site has low traffic or a tiny conversion rate, there are still strategies that may allow you to benefit from A/B testing. In his article 'How to Test and Improve Your Website If Your Traffic Is Too Low for A/B Testing', Rich Page suggests setting goals that take visitors towards your primary conversion objective, such as clicks on your call to action or engagement on the page.
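The reason these micro-conversion goals help is arithmetic: the higher the baseline rate of the goal you measure, the fewer visitors you need to detect a given relative uplift. A rough sketch using the standard two-proportion sample-size approximation (the baseline rates below are hypothetical):

```python
import math

def visitors_per_arm(base_rate, rel_uplift, z_alpha=1.96, z_beta=0.84):
    """Approximate visitors needed in each arm of an A/B test to detect
    a given relative uplift (two-sided 5% significance, 80% power)."""
    p1 = base_rate
    p2 = base_rate * (1 + rel_uplift)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    n = (z_alpha + z_beta) ** 2 * variance / (p2 - p1) ** 2
    return math.ceil(n)

# Detecting a 20% relative uplift on:
print(visitors_per_arm(0.02, 0.20))  # a 2% purchase rate -> ~21,000 per arm
print(visitors_per_arm(0.20, 0.20))  # a 20% CTA click rate -> ~1,700 per arm
```

Measuring clicks on a call to action instead of completed purchases cuts the required traffic by more than an order of magnitude in this example, which is exactly why micro-conversions make testing viable on low-traffic sites.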

I was asked to improve conversion on a mobile website with relatively low traffic and a small conversion rate. After reviewing the customer journey I came up with two test ideas, one for new registrations and one for existing users. The first test increased first-time deposits by around 50%, and the second improved revenues by 26%, or over £500,000 a year. So don't give up on your low-traffic sites.

We Understand Our Customers:


Sure, you may have a good handle on who your customers are and why they come to your site. But human behaviour is too unpredictable to make changes to your website based purely upon subjective opinions and gut instinct. There is no such thing as a neutral choice architecture: every aspect of your website can influence visitor behaviour, and not necessarily in the way you expect. People can't reliably predict their own future behaviour, and you shouldn't try to predict theirs without scientific evidence.

We Optimise Our Site By Improving Content:


I often hear people say they are "optimising a page" by improving the content or creating a new design based upon their subjective opinions. This is not optimisation; it is risk taking, and it could cost your business millions in lost revenues. Without a control in place, how do you know the change is improving site performance at all?

Conversion rates are constantly rising and falling due to many factors, from traffic source and competitor activity to the weather and time of day. This means you can't rely on a before-and-after analysis to measure the impact of changes you make to your site. A/B tests have a control group (i.e. the existing design experience) to allow an accurate comparison of the performance of different designs. Even then, most tests fail to generate a significant uplift, which demonstrates how difficult it is to predict the impact of new content on a website.

Test Culture:


A/B testing is most effective when there is a culture of evidence-based decision-making in the organisation and managers encourage people to take risks and try new ideas. Unfortunately, psychologists have shown that as people gain experience we become over-confident in our ability to predict outcomes, and so many of us rely on gut instinct. This may be fine in situations that are familiar to us, but the consequences of changing the design of a website are very unpredictable and prone to producing unintended changes in behaviour.

People are also often obsessed with their competitors and wrongly assume they have based their website design on what works or may have even A/B tested key pages themselves. This is a risky strategy as all too often your competitors have no better understanding of what works than you do. Further, this approach will result in your website looking similar to your competitors, thus reducing your ability to differentiate your brand.

Testing challenges this kind of culture and can create political power struggles in organisations where managers normally make decisions about website changes based upon their expertise and knowledge. Website optimisation is really a form of change management and innovation which requires a certain kind of culture to encourage it to flourish.


In the book The Wisdom of Crowds, the author James Surowiecki uncovers substantial evidence that individuals, small groups, and 'experts' in particular are very poor at predictions and forecasts. Large, diverse, independent-thinking crowds are much better at such decisions. That is why A/B testing is so powerful: it harnesses the crowd (i.e. your visitors) to tell you what works best.

Recommended reading:

Landing Page Optimization: The Definitive Guide to Testing and Tuning for Conversions

For more of our blogs visit conversion-uplift.co.uk/post/.