Conversion Rate Optimisation Strategy Mistakes
10 Top Conversion Rate Optimisation Strategy Mistakes:
There is plenty of advice on Twitter and other social media about conversion rate optimisation strategy, from the likes of ConversionXL, Widerfunnel and Hubspot to name but a few. Despite this, many organisations continue to make some basic errors that limit their ability to improve sales and revenues from their conversion rate optimisation strategy. Below are ten of the most fundamental mistakes that organisations tend to make with conversion rate optimisation strategy:
1. Don’t fully integrate web analytics tracking and reporting
The saying that if you don’t measure something you can’t tell whether you are improving rings true with website optimisation. Unless you have reliable web analytics monitoring and reporting of your KPIs from the beginning to the end of the user journey, you will never really know how your site is performing or what impact tactical changes have on your revenues. You will also struggle to prioritise effectively, as you need web analytics to identify the value of each step in the user journey. Conversion optimisation strategy depends upon comprehensive and reliable web analytics to inform decision making.
Web analytics are also important for validating test results and checking the robustness of uplifts. A/B testing solutions only support certain browsers and devices and need to be configured to ensure they cover all important use cases. What if your test doesn’t include an alternative user journey? Your web analytics can help identify these kinds of problems so that you can fix them.
2. Conversion Rate Optimisation Strategy = A/B Testing:
Although A/B testing can be a useful optimisation technique, it is only one of many activities that an organisation needs to use for an effective conversion rate optimisation strategy. The chart below shows the many activities companies use to improve conversion rates. Companies that have an effective strategy will do all of these and more. Furthermore, they won’t begin A/B testing until they have completed a thorough user experience audit to identify and fix problems with the customer experience.
3. Rely on Before & After Measures:
This kind of measurement can be misleading because conversion rates fluctuate continuously due to many factors. Competitor activity, website bugs, traffic source, advertising spend and even the weather can all cause your conversion rate to change. Because of this, you can only be confident that a change to your website is the reason for a significant uplift or decline in conversion by running an A/B or multivariate test.
These kinds of experiments allow you to isolate the impact of the difference in the customer experience by including a control. Traffic is randomly split between the two experiences, so all other drivers of your conversion rate should influence both variants equally.
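To make this concrete, here is a minimal sketch of how the result of such a split test might be checked for statistical significance using a standard two-proportion z-test. The visitor and conversion figures are invented for illustration; real programmes should also plan sample sizes in advance rather than peeking at results.

```python
# Illustrative two-proportion z-test for an A/B test result.
# All figures below are hypothetical, not real data.
from math import sqrt, erf

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Return the z-score and two-sided p-value for two conversion counts."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)                 # pooled conversion rate
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))   # standard error
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))    # two-sided p-value
    return z, p_value

# Hypothetical 50/50 traffic split: 10,000 visitors per variant.
z, p = two_proportion_z_test(conv_a=500, n_a=10_000, conv_b=560, n_b=10_000)
print(z, p)
```

With these invented numbers the uplift from 5.0% to 5.6% falls just short of the conventional 5% significance threshold, which illustrates why "before and after" eyeballing of smaller differences is so unreliable.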
4. Don’t A/B Test.
OK, you’ve fixed your user experience problems. What’s next? Provided you have enough traffic and conversions, A/B testing allows you to learn from your mistakes and identify what improves conversion. There are many reasons why organisations don’t conduct A/B testing, but the lack of online experiments can hinder your ability to reduce acquisition and retention costs.
A/B testing enables you to remove subjective opinions from decisions about which design or journey is better at meeting the organisation’s objectives. It also helps develop an evidence-based decision-making culture, which is key to a successful conversion rate optimisation strategy.
5. Only track a single measure of conversion:
It is beneficial to agree a single success metric for your conversion optimisation strategy. This is especially useful for A/B tests as it provides clear direction to everyone creating experiments. But if your success metric is total revenues or sales leads, that doesn’t mean you should ignore other metrics that could suggest a change is counter-productive. For example, if you are optimising to increase sales it would be appropriate to also measure average basket value and total revenues to understand how a change affects overall profitability. For a conversion rate optimisation strategy to be sustainable it needs to improve long-term profitability, not just short-term sales. This means having a long-term vision and suitable metrics to target.
For ecommerce this means monitoring metrics such as average order value, number of items per basket, sales from returning customers and returns. You will then get a better understanding of how the new customer experience influences user behaviour and your bottom line.
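As a simple illustration of the secondary metrics mentioned above, the sketch below computes average order value, items per basket and revenue from returning customers from a toy list of orders. The field names and figures are assumptions for the example, not a real ecommerce schema.

```python
# Toy order data: field names ("items", "value", "returning") are
# illustrative assumptions, not a real platform's schema.
orders = [
    {"customer": "a", "items": 2, "value": 40.0, "returning": False},
    {"customer": "b", "items": 1, "value": 15.0, "returning": True},
    {"customer": "c", "items": 3, "value": 65.0, "returning": True},
]

total_revenue = sum(o["value"] for o in orders)
average_order_value = total_revenue / len(orders)            # AOV
items_per_basket = sum(o["items"] for o in orders) / len(orders)
returning_revenue = sum(o["value"] for o in orders if o["returning"])

print(average_order_value, items_per_basket, returning_revenue)
```

Tracking a small basket of metrics like this alongside the headline conversion rate makes it much harder for a "winning" variant to quietly erode basket size or repeat business.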
A High Bounce Rate
With content marketing a high bounce rate is often seen as an indication of low engagement. But because of the way most web analytics tools calculate bounce rates and time on page, this may not be the case. Google Analytics counts a session containing only a single interaction hit as a bounce and records the session duration for such a visitor as zero. What if some of those visitors are spending several minutes engrossed in a post before exiting your site? Are they not engaged?
To understand true levels of engagement you need to also track how long bounced visitors spend on a page. This can be done by adding some extra script to your GA tag and setting up events in your web analytics. The point here is that no single metric will ever give you the whole story and it is essential to delve deeper into customer behaviour to truly understand the impact of changes you make to your site.
It is also essential to segment metrics, as there is no such thing as an average customer. Device, browser and new versus returning visitors are all dimensions that can significantly influence how your conversion rate performs. It’s important to analyse conversion rate by these segments, as otherwise you could draw the wrong conclusions.
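The sketch below, with invented figures, shows just how wrong the conclusions can be: a variant can beat the control in every device segment yet appear to lose overall, simply because its traffic mix skews towards the weaker-converting segment (a classic Simpson's paradox).

```python
# Invented figures illustrating Simpson's paradox in conversion data:
# segment -> (control_conversions, control_visits, variant_conversions, variant_visits)
segments = {
    "desktop": (200, 2000, 90, 800),    # control 10.0% vs variant 11.25%
    "mobile":  (40, 2000, 130, 3200),   # control  2.0% vs variant ~4.1%
}

def rate(conversions, visits):
    return conversions / visits

for name, (cc, cv, vc, vv) in segments.items():
    print(name, rate(cc, cv), rate(vc, vv))   # variant wins in each segment

# Aggregated rates reverse the picture because of the traffic mix.
total_c = sum(s[0] for s in segments.values()) / sum(s[1] for s in segments.values())
total_v = sum(s[2] for s in segments.values()) / sum(s[3] for s in segments.values())
print("overall", total_c, total_v)            # control looks better overall
```

Here the variant converts better on both desktop and mobile, but most of its traffic is low-converting mobile traffic, so the blended rate drops below the control's. Only segmented analysis reveals the true winner.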
6. Don’t have a dedicated team for CRO.
Without a dedicated conversion rate optimisation (CRO) specialist (or a team in larger enterprises), you will not achieve the full potential of optimisation, because generalists will struggle to develop the necessary skills or allocate sufficient time to the task. CRO requires specialist skills (e.g. web analytics and heuristic analysis) that take time to acquire and benefit from regular updating.
Developing strong hypotheses for testing is also a time-consuming process. As your A/B testing programme matures you may notice that between 50% and 80% of tests fail to generate a significant uplift in conversion. As a consequence you will need to run more tests to generate a reasonable return on investment (ROI).
Marketing generalists should be able to deliver landing page and other tactical tests, but they are unlikely to have the time or expertise to develop the more strategic optimisation roadmap required to achieve the full benefits of CRO. Generalists also often fail to develop strong hypotheses, or lack the time to build more complex tests, as their time horizons may be too short.
Strong Test Ideas
It is essential to have a continuous supply of strong test ideas in your pipeline to achieve the necessary scale of testing required for a good ROI. A centralised CRO team can easily allocate the necessary resource for the development of test ideas and ensure priority is given to websites or pages with the most potential for generating a high ROI. This minimises duplication of effort and facilitates the sharing of test results with all CRO specialists in the organisation.
A fragmented, silo-based approach to CRO is prone to failure because of its inefficient use of resources, often resulting in duplication of effort and a focus on tactical rather than strategic optimisation. A lack of co-ordination and control of CRO also tends to prevent the implementation of a structured approach to optimisation, as each silo develops its own ad-hoc processes and KPIs. This is generally a recipe for disaster and a reason why CRO fails to deliver a good ROI.
7. Put junior people in charge of optimisation.
A/B testing is a form of experimental research and as such should be seen as part of your innovation strategy. It needs to be headed up by a senior person who can deal with all the obstacles that prevent change in an organisation. A junior person is unlikely to have the clout to navigate office politics, and almost certainly won’t have the authority to optimise product, sales channels or customer services, or to prioritise development projects.
Few companies grasp this: for website optimisation to achieve its true potential you need to look at the whole customer journey and optimise all the inputs, not just the new-customer sign-up-to-buy process. Look at the companies that excel at optimisation. Organisations like Amazon, Spotify, Skyscanner and Netflix all have directors or senior managers in charge of their testing strategy, and they don’t limit themselves to new customer journeys.
If you don’t have a senior role in your organisation for conversion rate optimisation, consider hiring a conversion rate optimisation consultant. They can review your processes and ensure your conversion rate optimisation strategy is on solid ground.
8. Don’t formulate hypotheses.
When generating ideas for A/B tests it is important to base the experiment on a hypothesis about how and why the change will influence user behaviour. A hypothesis explains the rationale and also predicts the outcome of the test, so that you know which success metrics to set. The hypothesis needs to be based upon evidence gathered from an agreed optimisation process rather than pure gut feeling, as otherwise you may struggle to learn from successful tests. Without strong hypotheses, A/B testing becomes a random and undirected process that will fail to generate the full benefits of CRO.
9. Don’t have a clear strategy for testing.
There is no point relying on low-hanging fruit and best practice to direct your A/B testing, as these sources will soon run dry and your testing programme will lack direction. It’s important to follow a recognised, structured optimisation process that draws insights from a range of sources, especially from customers.
And yet companies are often more concerned with competitors, and copying their ideas, than with listening to customers. This is a serious mistake and will lead to a sub-optimal testing programme. Customer insight and usability research are vital because to develop strong testing ideas you need a good understanding of customer personas, their goals, the tasks that lead towards those goals, and how users interact with your website or app.
Otherwise, how can you expect to develop hypotheses that predict user behaviour? You could be making assumptions about customers that have no basis in reality. The more insights you can gather from your customers, the greater your chance of identifying a significant problem or improvement that will increase conversions.
10. Think it’s all about design.
I’ve heard this so many times, but do your visitors really come to your site to look at its design? I don’t think so. People come to your site to complete a task and are rarely interested in your “cool” design. In fact most conversion rate experts agree that, all too often, ugly beats beautiful design.
Just look at Amazon.com and eBay.com; neither is what anyone would call an aesthetically great design. They are functional, they may offer a great deal, and most importantly of all they let users do what they want to do without having to think too much. Conversion rate optimisation strategy must focus on the customer first, not the subjective opinions of designers.
Designers may be good at composing a new webpage or app screen, but that doesn’t mean they understand your main customer segments or know what improves conversions or revenues. Conversion optimisation strategy requires a collaborative process, so designers must work closely with CRO experts to deliver new experiences based upon evidence rather than subjective opinion. Otherwise you will end up with experiences based upon design principles rather than CRO insights, and with limited, if any, learning from the process.