In the fast-paced world of web development and digital marketing, creating new features to enhance the user experience is par for the course. Everybody has ideas about making the website better, and it usually involves some sort of magical feature that will help users find the exact product they want in a matter of seconds - a shopping genie, or something involving AI. The new feature for your website is almost always visually appealing and interactive, and the designers are confident it will boost user engagement with their favourite persona. Before rolling it out, you wisely decide to run an A/B test to measure its effectiveness.
So you code the test, and you run it for the usual length of time. You follow all the advice on LinkedIn about statistical significance (we can all describe it and we all have our own ways of calculating it, thank you) and getting a decent sample size. The test results are in, and they're a mixed bag. On one hand, the new feature is a hit in terms of engagement. It receives twice as many clicks as the other clickable elements on the page, such as those lovely banners, the promotional links and the pretty pictures. However, a deeper dive into the data reveals a concerning trend. While the new feature attracts a lot of attention and engagement, the conversion rate for users who interact with it is only around 2.5%. In contrast, the conversion rate for users who engage with the existing content on the page is significantly higher, at around 4.1%.
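Since we all have our own ways of calculating significance, here is one common one: a two-proportion z-test on the two conversion rates. The article quotes the rates (2.5% vs 4.1%) but not the sample sizes, so the counts below are purely hypothetical, chosen only to match those rates; this is a sketch, not the test the author ran.

```python
from math import sqrt, erf

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates.

    conv_a, conv_b: number of conversions in each group
    n_a, n_b: number of users (or sessions) in each group
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that both groups convert equally
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal distribution
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical counts matching the article's rates: 2.5% vs 4.1%
z, p = two_proportion_z_test(100, 4000, 164, 4000)
# p comes out well below 0.05: a gap this size on samples this large
# is very unlikely to be noise
```

With samples in the low thousands, a 2.5% vs 4.1% split is comfortably significant - which is exactly the uncomfortable part: the feature really is converting worse, not just unluckily.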
This is key: looking only at engagement data (click-through rate) as a success metric is insufficient. Yes, it is important, but it is not enough. After all, if you want to create a banner with a high click rate, then you could simply write "Buy one get one free", or better still, "Buy one get two free - click here." It's essential that you set expectations with your banners, calls to action and features - what's to stop you from writing "Click here for free money!"? If your priority in testing is to generate clicks, then you'll degenerate into coding the on-site versions of clickbait, and that's a terrible waste of a potential lead.
So, what went wrong with your test? The short answer is that you coded a pretty distraction. Here’s a breakdown of why this happens and how to address it:
Misalignment with User Intent
The new feature, despite being engaging, may not align with the primary intent of your users. If it diverts their attention away from the main conversion paths, it can reduce overall effectiveness. Users might be intrigued by the new feature but not find it relevant to their immediate needs. You misunderstood your persona's motivation; it might be time to rewrite your persona to capture this additional information.
Cognitive Load
Introducing a new element can increase the cognitive load on users. They have to process and understand this new feature, which can be mentally taxing. If the feature doesn’t provide immediate value or clarity, users might get distracted and abandon their original task. They used up their time, effort and patience while interacting with your new feature, and gave up on their primary purpose (which was to buy something from you).
Disruption of User Flow
A well-designed website guides users smoothly towards conversion goals. A new feature that stands out too much can disrupt this flow, causing users to deviate from their intended path. This disruption can lead to lower conversion rates, as users get sidetracked, left wondering, "How do I get back to my intended path?" This new feature has proved to be the next big shiny thing, and while it's attracting user engagement, it's confusing users and preventing them from getting to where they wanted to go.
The Solutions
To avoid coding distractions, consider the following strategies:
User-Centric Design
Not my favourite phrase, since it leads to design without A/B testing and designers designing for their favourite personas. Ensure that any new feature is designed with the user's needs and goals in mind. Conduct user research to understand what your audience values and how they navigate your site, and then align your new features and your development roadmap with these insights. This will enhance, rather than disrupt, the user experience, and reduce the amount of time wasted on developing the next shiny bauble - it looks nice and impresses senior management, but is not good for users.
Incremental Testing
Instead of launching a fully-fledged feature, start with a minimal viable version and test its impact incrementally. This approach allows you to gather feedback and make necessary adjustments before a full rollout. Use test data in conjunction with user research to gain a full picture of what you thought was going to happen, and what actually happened.
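One common mechanical way to do this incremental rollout is a percentage-based feature flag: each user is deterministically bucketed, so the same visitor always sees the same experience while you gradually widen the audience. This is a generic sketch - the function name, feature key and bucketing scheme are illustrative assumptions, not a reference to any particular flagging product.

```python
import hashlib

def in_rollout(user_id: str, feature: str, rollout_pct: int) -> bool:
    """Deterministically decide whether a user sees a feature.

    Hashing the (feature, user) pair gives a stable bucket from 0-99,
    so a user's experience never flips between visits, and raising
    rollout_pct only ever adds users to the exposed group.
    """
    digest = hashlib.md5(f"{feature}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100
    return bucket < rollout_pct

# Start the minimal viable version at 10% of traffic
show_genie = in_rollout("user-42", "shopping-genie", 10)
```

Because the bucketing is deterministic, the 10% cohort is a stable population you can follow through your analytics - exactly what you need to compare what you thought would happen with what actually happened before widening the rollout.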
Clear Value Proposition
Make sure the new feature has a clear and compelling value proposition. Users should immediately understand its purpose and how it benefits them. This clarity can help integrate the feature seamlessly into the user journey. If the test 'fails', then you'll learn that the value proposition you promoted was not what users wanted to read, and you can try something different.
Monitor and Iterate
Continuously monitor the performance of new features and be ready to iterate based on user feedback and data. If a feature is not performing as expected, don’t hesitate to tweak or even remove it to maintain a smooth user experience. It's time to swallow your pride and start again. If you change direction now, you'll have less distance to travel than if you wait six months before unimplementing the eye-catching blunder you've launched.
Conclusion
In the quest to innovate and improve user engagement, it’s crucial to strike a balance between novelty and functionality. While new features can attract attention, they must also support the primary goals of your website. By focusing on user-centric design, incremental testing, and clear value propositions, you can avoid coding distractions and create features that truly enhance the user experience.
Other Web Analytics and Testing Articles I've Written
How not to segment test data (even if your stakeholders ask you to)
Designing Personas for Design Prototypes
Web Analytics - Gathering Requirements from Stakeholders
Analysis is Easy, Interpretation Less So
Telling a Story with Web Analytics Data
Reporting, Analysing, Testing and Forecasting
Pages with Zero Traffic