
Wednesday 28 November 2018

The Hierarchy of A/B Testing

As any A/B testing program matures, it becomes important to work out not only what you should test (and why), but also the order in which to run your tests.

For example, let's suppose that your customer feedback team has identified a need for a customer support tool that helps customers choose which of your products best suits them.  Where should it fit on the page?  What should it look like?  What should it say?  What color should it be?  Is it beneficial to customers?  How are you going to unpick all these questions and come up with a testing strategy for this new concept?

These questions should be arranged into a sequence of tests, with the most important questions answered first.  Once you've answered those, the rest can follow in sequence.

Firstly:  PRESENCE:  is this new feature beneficial to customers?
In our hypothetical example, it's great that the customer feedback team have identified a potential need for customers.  The first question to answer is: does the proposed solution meet customer needs?  And the test that follows from that is:  what happens if we put it on the page?  Not where (top versus bottom), or what it should look like (red versus blue versus green), but should it go anywhere on the page at all?

If you're feeling daring, you might even test removing existing content from the page.  It's possible that content has been added slowly and steadily over weeks, months or even longer, and hasn't been tested at any point.  You may ruffle some feathers with this approach, but if something looks out of place then it's worth asking why it was put there.  If you get an answer similar to "It seemed like a good idea at the time" then you've probably identified a test candidate.


Let's assume that your first test is a winner.  Customers like the new feature, and you can see this in their engagement with it - how many people click on it, hover over it, enter their search parameters and view the results - and, crucially, it leads to improved conversion.
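Whether that improvement is real or just noise comes down to statistics rather than gut feel.  As a rough sketch (the visitor and conversion numbers here are entirely hypothetical), a two-proportion z-test on conversion might look like this in Python:

```python
from statistics import NormalDist

def conversion_lift(control_conv, control_n, variant_conv, variant_n):
    # Observed conversion rates in each group
    p1 = control_conv / control_n
    p2 = variant_conv / variant_n
    # Pooled rate under the null hypothesis of "no difference"
    pooled = (control_conv + variant_conv) / (control_n + variant_n)
    se = (pooled * (1 - pooled) * (1 / control_n + 1 / variant_n)) ** 0.5
    z = (p2 - p1) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided test
    return p2 - p1, p_value

# Hypothetical numbers: 10,000 visitors per group, 3.0% vs 3.4% conversion
lift, p = conversion_lift(300, 10_000, 340, 10_000)
```

With these made-up numbers the lift is +0.4 percentage points but the p-value sits above the usual 0.05 threshold, so you'd keep the test running rather than declare victory early.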

Next:  POSITION:  where should it fit on the page?

Your first test proved that it should go on the page - somewhere.  The next step is to determine the optimum placement.  Should it get pride of place at the top of the page, above the fold (yes, I still believe in 'the fold' as a concept)?  Or is it a sales support tool that is best placed somewhere below all the marketing banners and product lists?  Or does it even fit at the bottom of the page as a catch-all for customers who are really searching for your products?


Because web pages come in so many different styles...
This test will show you how engagement varies with placement for this tool - but watch out for changes in click-through rates for the other elements on your page.  You can expect your new feature to get more clicks if you place it at the top of the page, but are these at the expense of clicks on more useful page content?  Naturally, the team that have been working on the new feature will have their own view on where it should be placed, but what's the best arrangement for the page as a whole?  And what's actually best for your customer?
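One way to spot that cannibalisation is to compare per-element click-through rates between the two placements.  This is an illustrative sketch - the element names and click counts are made up:

```python
def ctr_deltas(impressions, control_clicks, variant_clicks):
    # Per-element change in click-through rate between two placements.
    # Negative values suggest the new feature is cannibalising clicks
    # from existing content rather than adding net engagement.
    deltas = {}
    for element in control_clicks:
        ctr_control = control_clicks[element] / impressions["control"]
        ctr_variant = variant_clicks[element] / impressions["variant"]
        deltas[element] = round(ctr_variant - ctr_control, 4)
    return deltas

# Hypothetical counts: the new feature moved to the top of the page
impressions = {"control": 50_000, "variant": 50_000}
control = {"new_feature": 1_500, "hero_banner": 4_000, "product_list": 6_000}
variant = {"new_feature": 3_000, "hero_banner": 3_200, "product_list": 5_900}
deltas = ctr_deltas(impressions, control, variant)
```

In this fabricated example the new feature doubles its clicks, but the hero banner loses some of its own - exactly the trade-off you'd want to weigh before settling on a placement.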

Next:  APPEARANCE:  what should it look like?

This question covers a range of areas that designers will love to tweak and play with.  At this point, you've answered the bigger questions around presence (yes) and position (optimum), and now you're moving on to appearance.  Should it be big and bold?  Should it fit in with the rest of the page design, or should it stand out?  Should it be red, yellow, green or blue?  There are plenty of questions to answer here, and you'll never be short of ideas to test.


Take care:
It is possible to answer multiple questions with one test that has multiple recipes, but take care to avoid addressing the later questions without first answering the earlier ones.
If you introduce your new feature in the middle of the page (without testing) and then start testing what the headline and copy should say, then you're testing in a blind alley, without knowing whether you already have the best placement.  And if your test recipes all lose, was it because you changed the headline from "Find your ideal sprocket" to "Select the widget that suits you", or was it because the feature simply doesn't belong on the page at all?
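There's also a traffic cost to answering everything at once.  As a quick illustrative sketch (the recipe names are invented), every combination in a multivariate test needs its own share of visitors:

```python
from itertools import product

position = ["top", "middle", "bottom"]
appearance = ["red", "blue", "green"]

# One big multivariate test: every combination is its own recipe,
# so each recipe receives only a fraction of the traffic.
combined = list(product(position, appearance))
n_combined = len(combined)                        # 3 * 3 = 9 recipes

# Sequential tests: position first, then appearance on the winner.
n_sequential = len(position) + len(appearance)    # 3 + 3 = 6 recipes

# With, say, 90,000 visitors: 10,000 per combined recipe, versus
# 30,000 per sequential recipe - a faster route to significance.
```

Sequencing the questions means fewer recipes running at any one time, each with more traffic behind it.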

Also take care not to become bogged down in fine-detail questions while you're still answering the more general ones.  It's all too easy to get tangled up in discussions about whether the feature should be black with white text or white with black text, when you haven't even tested having the feature on the page at all.  The cosmetic questions around placement and appearance are far more interesting and exciting than the necessary work of getting the new element onto the page and making it function.

For example, NASA recently landed another probe on Mars.  It wasn't easy, and I don't imagine there were many people at NASA quibbling about the colour of the parachute or the colour of the rocket itself.  Most people were focused on actually getting the probe onto the Martian surface.  The same general rule applies in A/B testing - sometimes just getting the new element working and present on the page generates enough difficulties and challenges, especially if it's a dynamic element that calls APIs or other third-party services.

In those situations, yes, there are design questions to answer, but 'best guess' is a perfectly acceptable answer.  What should it look like?  Use your judgement; use your experience; maybe even use previous test data, and come back to it in a later test.  


Testing InSight's parachute; Image: NASA
But don't go introducing additional complexity and more variables where they're really not welcome.  What colour was the NASA parachute?  The one that was easiest to produce.

Once your first test has proved the case for presence, it becomes a matter of optimising the remaining details: CTA button wording and colour; smaller elements within the new feature; the 'colour of the parachute', and so on.  You'll find there's more interest in tweaking the design of a winner than there was in getting it working in the first place, but that's fine... just roll with it!