Case studies. Every testing tool provider has them - in fact, most sales people have them, so let's not limit this to online optimisation. Any good sales team will harness the persuasive power of a good case study: "Look at what our customers achieved by using our product" - whether the product is skin care, shampoo, new computer hardware, or anything else. But for some reason, the online testing community really, really seems to enthuse about case studies in a way I've not seen anywhere else.
And here's the link they refer to.
Quoting the headlines from that site, there are five problems with A/B testing case studies:
- What may work for one brand may not work for another.
- The quality of the tests varies.
- The impact is not necessarily sustainable over time.
- False assumptions and misinterpretation of results.
- Success bias: The experiments that do not work well usually do not get published.
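That last point - success bias - is easy to demonstrate with a short simulation (a sketch with made-up numbers: the base rate, true lift, sample sizes and the 10% "publishable win" threshold are all my assumptions, not anything from a real case study):

```python
import random

random.seed(42)

def run_experiment(true_lift=0.02, base_rate=0.05, n=2000):
    """Simulate one A/B test and return the observed relative lift."""
    # Hypothetical numbers: n visitors per variant, 5% baseline conversion.
    a = sum(random.random() < base_rate for _ in range(n))
    b = sum(random.random() < base_rate * (1 + true_lift) for _ in range(n))
    return (b - a) / max(a, 1)

# Run many experiments with the same small true lift; "publish" only the
# eye-grabbing wins, as a case-study blog would.
results = [run_experiment() for _ in range(500)]
published = [r for r in results if r > 0.10]

avg_all = sum(results) / len(results)
avg_published = sum(published) / len(published)
print(f"average lift over all runs:  {avg_all:.1%}")
print(f"average lift in 'published': {avg_published:.1%}")
```

Every experiment here has the same modest 2% true lift, yet the "published" subset - the runs that happened to land above the threshold through noise - reports a much bigger average. That gap is exactly what you are looking at when you read a wall of double-digit case-study wins.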
I came to the worrying suspicion that people (and maybe Qualaroo) read A/B testing case studies and then implement the featured test win on their own site with no further thought - no thought about whether the win applies to their customers and their website, or even whether the test was valid. Maybe it's just me (and it really could be just me), but when I read A/B testing case studies, I don't immediately think, 'Let's implement that on our site.' My first thought is, 'Shall we test that on our site too?'
And yes, there is success bias. That's the whole point of case studies, isn't it? "Look at the potential you could achieve with our testing tool" is significantly more compelling than "Use our testing tool and see if you can get flat results after eight weeks of development and testing." I expect eye-grabbing headlines, and I anticipate having to trawl through the blurb and the sales copy to get to the test design, the screenshots and possibly some mention of actual results.
So let's stick with A/B tests. Let's not be blind to the possibility that our competitors' sites run differently from ours, attract different customers and have different opportunities to improve. Read the case studies, be skeptical - or at least discerning - and if the test design seems interesting, construct your own test on your own site that will satisfy your own criteria for calling a win. And keep on optimising.
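One common way to set that criterion for calling a win is a standard two-proportion z-test. A minimal sketch, with made-up conversion counts (100/2000 for control, 130/2000 for the variant - hypothetical numbers, and any serious testing tool computes this for you):

```python
from math import erf, sqrt

def ab_test_p_value(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test for an A/B conversion comparison."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that A and B convert equally.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal CDF, built from math.erf.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

z, p = ab_test_p_value(100, 2000, 130, 2000)
print(f"z = {z:.2f}, p = {p:.3f}")
```

Decide the threshold (and the sample size) before the test starts, not after peeking at the results - otherwise you are manufacturing exactly the kind of result the case studies are full of.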