
Thursday, 28 August 2014

Telling a Story with Web Analytics Data

Management demands actionable insights - not just numbers, but KPIs, words, sentences and recommendations.  It's therefore essential that we, as web analysts and optimisers, are able to transform data into words - and, better still, stories.  A report with too much data and too little information reads like a science report, not a business readout.  Here's an example:

Consider a report concerning four main characters:
Character A: female, aged 7 years old.  Approximately 1.3 metres tall.
Character B:  male, aged 5 years old.
Character C: female, aged 4 years old.
Character D:  male, aged 1 year old.

The main items in the report are a small cottage, a 1.2 kw electric cooker, 4 pints of water, 200 grams of dried cereal and a number of assorted iron and copper vessels, weighing 50-60 grams each.

After combining most of the water and dried cereal in the largest of the copper vessels, Character B prepared a mixture which reached around 70 degrees Celsius.  He dispensed this unevenly into three of the smaller vessels in order to enable thermal equilibrium to be reached between the mixture and its surroundings.  Characters B, C and D then walked 1.25 miles in 30 minutes, averaging just over 4 km/h.  In the interim, Character A took some empirical measurements of the chemical mixture, finding Vessel 1 to still be at a temperature close to 60 degrees Celsius, Vessel 2 to be at 70 degrees Fahrenheit and Vessel 3 to be at 315 Kelvin, which she declared to be optimal.

The report continues with Character A consuming all of the mixture in Vessel 3, then single-handedly testing (in some cases, destruction testing) much of the furniture in the small cottage.

The problem is:  there's too much data and not enough information. 

The information is presented in various formats - lists, sentences and narrative.


Some of the data is completely irrelevant (the height of Character A, for example);
Some of it is misleading (the ages of the other characters lack context);
Some of it is presented in a mish-mash of units (temperatures are stated four times, with three different units).
The calculation of the speed of the walking characters is not clear - the distance is given in miles; the time is given in minutes; and the speed in kilometres per hour (if you are familiar with the abbreviation km/h).
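
To see how much friction those mixed units create, here's a minimal sketch (in Python, purely for illustration) that restates the report's figures in a single, consistent set of units - the conversion work a reader shouldn't be left to do in their head:

MILES_TO_KM = 1.609344

def speed_kmh(distance_miles, time_minutes):
    # Speed in km/h from a distance in miles and a time in minutes.
    return (distance_miles * MILES_TO_KM) / (time_minutes / 60)

def fahrenheit_to_celsius(f):
    return (f - 32) * 5 / 9

def kelvin_to_celsius(k):
    return k - 273.15

# The walk: 1.25 miles in 30 minutes
print(f"Walking speed: {speed_kmh(1.25, 30):.1f} km/h")      # just over 4 km/h

# The three vessels, all restated in degrees Celsius
print(f"Vessel 1: {60:.1f} C")                               # already Celsius
print(f"Vessel 2: {fahrenheit_to_celsius(70):.1f} C")        # about 21 C
print(f"Vessel 3: {kelvin_to_celsius(315):.1f} C")           # about 42 C

Restated consistently, the figures are far easier to compare - and it's immediately clear why Character A settled on Vessel 3.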

Of course, this is an exaggeration, and as web analytics professionals, we wouldn't do this kind of thing in our reporting. 

Visitors are called visitors, and we consistently refer to them as visitors (and we ensure that this definition is understood among our readers)
Conversion rates are based on visitors, even though this may require extra calculation since our tools provide figures based on visits (or sessions) - see the short sketch after this list
Percentage of traffic coming from search is quoted as visitors (not called users), and not visits (whether you use visitors or visits is up to you, but be consistent)
Would you include number of users who use search?  And the conversion rate for users of search?
And when you say 'Conversion', are you consistently talking about 'user added an item to cart', or 'user completed a purchase and saw the thank-you page'?
Are you talking about the most important metrics?
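
As an illustration of the conversion-rate point above, here's a minimal sketch (in Python, with invented figures) showing how a per-visit conversion rate differs from a per-visitor one:

# Minimal sketch with invented figures: restate conversion per visitor
# rather than per visit/session, since most tools report the latter.
visits = 12_000      # sessions reported by the analytics tool
visitors = 9_000     # unique visitors over the same period
orders = 450         # completed purchases (thank-you page seen)

print(f"Conversion per visit:   {orders / visits:.1%}")    # 3.8%
print(f"Conversion per visitor: {orders / visitors:.1%}")  # 5.0%

Same orders, same period, two different rates - so the report needs to say which one it means, and stick to it.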
 
Too much data, not enough information?
So - make sure, for starters, that your units and data and KPIs are consistent, contextual, or at least make sense.  And then:  add the words to the numbers.  It's only a start to say: "We attracted 500 visitors with paid search, at a total cost of £1,200."  Go on to talk about the cost per visitor, and break it down into key details by talking about the most expensive keywords and the ones that drove the most traffic.  But then tell the story:  there's a sequence of events between a user seeing your search term, clicking on your ad, visiting your site, and [hopefully] converting.  Break it down into chronological steps and tell the story!
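
For example, here's a minimal sketch (in Python, with invented keywords and a made-up split of that £1,200) of the kind of breakdown that turns the headline number into the start of a story:

# Minimal sketch (invented keyword-level figures) breaking the paid-search
# headline - 500 visitors for £1,200 - down into cost per visitor by keyword.
campaign = [
    # (keyword, visitors, cost in GBP)
    ("blue widgets",       260, 780.00),
    ("buy widgets online", 180, 310.00),
    ("cheap widgets",       60, 110.00),
]

total_visitors = sum(v for _, v, _ in campaign)
total_cost = sum(c for _, _, c in campaign)
print(f"Total: {total_visitors} visitors for £{total_cost:,.2f} "
      f"= £{total_cost / total_visitors:.2f} per visitor")

# Most expensive keywords first
for keyword, visitors, cost in sorted(campaign, key=lambda row: row[2], reverse=True):
    print(f"{keyword:<20} {visitors:>4} visitors   £{cost:>7.2f}   £{cost / visitors:.2f} per visitor")

The numbers alone still aren't the story, though: the story is what happened between the search, the click and the conversion - and what you plan to do about the keyword costing £3.00 per visitor versus the one costing £1.72.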

There are various ways to ensure that you're telling the story; my favourite approach is to answer these kinds of questions:
"You say that metric X has increased by 5%.  Is that a lot?  Is that good?"
 "WHY has this metric gone up?"
"What happened to our key site performance indicators (profit, revenue, conversion) as a result?"
and my favourite
"What should we do about it?"

Character A
There are, of course, various ways to hide the story, or to disguise results that are not good (i.e. that do not meet sales or revenue targets) - I did this in my anecdote at the start.  However, management tend to start looking at incomplete data, or data that's obscure or irrelevant, and go on to ask about the data that's "missing"... the truth will out, so it's better to show the data, tell the whole story, and highlight why things are below par.

It's our role to highlight when performance is down - we should be presenting the issues (nobody else has the tools to do so) and then going on to explain what needs to be done - this is where actionable insights become invaluable.  In the end, we present the results and the recommendations and then let the management make the decision - I blogged about this some time ago - web analytics: who holds the steering wheel?

In the case of Characters A, B, C and D, I suggest that Characters B and C buy a microwave oven, and improve their security to prevent Character A from breaking into their house and stealing their breakfast.  In the case of your site, you'll need to use the data to tell the story.

Thursday, 14 August 2014

I am a power-tool A/B skeptic

I have recently enjoyed reading Peter W Szabo's article entitled "I am an A/B testing skeptical".  Sure, it's a controversial title (especially when he shared it in the LinkedIn Group for web analysts and optimisers), but it's thought-provoking nonetheless.

And reading it has made me realise:  I am a power-drill skeptic. I've often wondered what the benefit of having the latest Black and Decker power tool might actually be.  After all, there are plenty of hand drills out there that are capable of drilling holes in wood, brick (if you're careful) and even metal sheet. The way I see it, there are five key reasons why power drills are not as good as hand-drilling (and I'm not going to discuss the safety risks of holding a high-powered electrical device in your hand, or the risks of flying dust and debris).


5.  There's no consistency in the size of hole I drill.

I can use a hand drill and, by watching how hard I press and how quickly I turn the handle, monitor the depth and width of the hole I'm drilling.  Not so with a power drill - sometimes it flies off by itself, sometimes it drills too slowly.  I have read about this online, and I've watched some YouTube videos.  I have seen some experienced users (or professionals, or gurus, or power users) drill a hole which is 0.25 inches in diameter and 3 inches deep, but when I try to use the same equipment at home, I find that my hole is much wider (especially at the end) and often deeper.  Perhaps I'm drilling into wood and they're drilling into brick?  Perhaps I'm not using the same metal bits in my power drill?  Who knows?



4.  Power drill bits wear out faster.

Again, in my experience, the drill bits I use wear out more quickly with a power drill.  Perhaps the side isn't the best place to leave them, especially in a damp environment.  I have found that my hand drill works fine because I keep it in my toolbox and take care of it, but having several drill bits for my power tool means I don't have the space or time to keep track of them all; what happens is that I often try to drill with a power-drill bit that's worn out and a little bit rusty, and the results aren't as good as when the drill bits were new.  The drill bits I buy at Easter are always worn out and rusty by Christmas.

The professionals always seem to be using shiny new tools and bits, but not me.  But, as I said, this hasn't been a problem previously because having one hand-drill with only a small selection of bits has made it easier to keep track of them.  That's a key reason why power tools aren't for me. 


3.  Most power drills are a waste of time.

Power drills are expensive, especially when compared to the hand tool version.  They cost a lot of money, and what's the most you can do with them?  Drill holes.  Or, with careful use, screw in screws.  No, they can't measure how deep the hole should be, or how wide.  Some models claim to be able to tell you how deep your hole is while you're drilling it, but that's still pretty limited.  When I want to put up a shelf, I end up with a load of holes in a wall that I don't want, but that's possibly because I didn't think about the size of the shelf, the height I wanted it or what size of plugs I need to put into the wall to get my shelf to stay up (and remain horizontal).  Maybe I should have measured the wall better first, or something.
Measure twice, drill once.
2.  I always need more holes 

As I mentioned when discussing power drills being a waste of time, I often find that, compared to the professionals, I have to drill a lot more holes than I should.  They seem to have this uncanny ability to drill the holes in exactly the right places (how do they do that?) and then put their bookshelves up perfectly.  They seem to understand the tools they're using - the drill, the bits, the screws, the plugs, the wall - and yet when I try to do this with one of their new-fangled power-drills, I end up with too many holes.  I keep missing what I'm aiming for; perhaps I need more practice.  As it is, when I've finished one hole, I can often see how I could make it better and what I need to do, and I get closer and closer with each of the subsequent holes I drill.  Perhaps the drill is just defective?

1.  Power drills will give you holes, but they won't necessarily be the right size

 
This pretty much sums up power drills for me, and it's the biggest flaw inherent in power tools.  I've already said that they're only useful for drilling holes, and that the holes are often too wide, too short and in the wrong place.  In some cases, when one of my team has identified that the holes are in the wrong place, they've been able to quickly suggest a better location - only to then find that that's also incorrect, leaving two wrong holes and still no way of completing the job.  It seems to me that drilling holes and putting up bookshelves (or display shelving, worse still) is something that's just best left to the professionals.  They can afford the best drill bits and the most expensive drills, and they also have the money available to make so many mistakes - it's clear to me that they must have some form of Jedi mind power, or foreknowledge of the kinds of holes they want to drill and where to drill them.


In conclusion:

Okay, you got me, perhaps I am being a little unkind.  There are a lot of web analytics and A/B testing professionals out there, but there is also a large number of amateurs who want to try their hand at online testing and who get upset or confused when it doesn't work out.  Like any skilled profession, analytics and optimisation done properly are harder than they look (especially when they look easy).  It takes time and thought to run a good test (let alone build up a testing program), to make sure you're hitting the target you're aiming for, and to know why you're running the test in the first place.  It takes more than just the ability to split traffic between two or more designs.

Yes, I've parodied Peter W Szabo's original article, but that's because it seemed to me the easiest way to highlight some of the misconceptions that he's identified, and which exist in the wider online optimisation community - especially the ideas that 'tests will teach you useful things', and the underlying misconception that 'testing is quick and easy'.  I will briefly mention that you need a reason to run a test (just as you need a reason to drill a hole) and you need to do some analytical thinking (using other tools, not just testing tools) in the same way as you would use a spirit level, a pencil and a ruler when drilling a hole.

Drilling the hole in the wall is only one step in the process of putting up a bookshelf; splitting traffic in a test should be just one step in the optimisation process, and should be preceded by some serious thought and design work, and followed up with careful review and analysis.  Otherwise, you'll never put your shelf up straight, and your tests will never tell you anything.