Web Optimisation, Maths and Puzzles: insight



Sunday, 24 November 2024

Testing versus Implementing - why not just switch it on?

"Why can't we just make a change and see what happens? Why do we have to build an A/B test - it takes too long!  We have a roadmap, a pipeline and a backlog, and we haven't got time."

It's not always easy to articulate why testing is important - especially if your company is making small, iterative, data-backed changes to the site and your tests consistently win (or, worse still, go flat).  The IT team is testing carefully and cautiously, but the time taken to build and run each test is slowing down everybody's pipelines.  You work with the IT team to build the test (which takes time), it runs (which takes even more time), you analyse the results (why bother?) and you show that their good idea was indeed a good idea.  Who knew?


Ask an AI what a global IT roadmap looks like...

However, if your IT team is building and deploying something genuinely new to your website - a new way of identifying a user's delivery address, or a new way of helping users decide which sparkplugs or ink cartridges or running shoes they need - something innovative and very different, then I would strongly recommend that you test it with them, even if there is strong evidence for its effectiveness.  Yes, they have carried out user-testing and it's done well.  Yes, their panel loved it.  Even the Head of Global Synergies liked it, and she's a tough one to impress.  Their top designers have spent months in collaboration with the project manager, and their developers have gone through the agile process so many times that they're as flexible as ballet dancers.  They've only just made the deadline for pre-Christmas implementation, and now is the time to implement.  It is ready.  However, the Global Integration Leader has said that they must test before they launch, but that's okay: they have allocated just enough time for a pre-launch A/B test, and they'll go live as soon as the test is complete.


Sarah Harries, Head of Global Synergies

Everything hinges on the test launching on time, which it does.  Everybody in the IT team is very excited to see how users engage with the new sparkplug selection tool and - more importantly for everybody else - how much it adds to overall revenue.  (For more on this, remember that clicks aren't really KPIs). 

But the test results come back, and you have to report that the test recipe is underperforming: conversion is down by 6.3%.  Engagement with the new tool looks healthy at 11.7%, but those users are dragging down overall performance.  The page exit rate is lower, yet fewer users are going through checkout and completing a purchase.  Even after two full weeks, the data is looking negative.

Can you really recommend implementing the new feature?  No; but that's not the end of the story.  It's now your job to unpick the data and turn analysis into insight: why didn't it win?

The IT team, understandably, want to implement.  After all, they've spent months building this new selector and the pre-launch data was all positive.  The Head of Global Synergies is asking them why it isn't on the site yet.  Their timeline allowed three weeks for testing and you've spent three weeks testing.  Their unspoken assumption was that testing was a validation of the new design, not a step that might turn out to be a roadblock, and they had not anticipated any need for post-test changes.  It was challenging enough to fit in the test, and besides, the request was to test it.

It's time to interrogate the data.

Moreover, they have identified some positive data points:

*  Engagement is an impressive 11.7%.  Therefore, users love it.
*  The page exit rate is lower, so more people are moving forwards.  That's all that matters for this page:  get users to move forwards towards checkout.
*  The drop in conversion is coming from the pages in the checkout process.  That can't be related to the test, which is in the selector pages.  It must be a checkout problem.

They question the accuracy of the test data, which contradicts all their other data.

* The sample size is too small.
* The test was switched off before it had a chance to recover from its 6.3% drop in conversion (a quick significance check, sketched below, turns both of these into testable claims).
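Whether the sample really is too small isn't a matter of opinion - it's a calculation.  Here's a minimal sketch of a two-proportion z-test for the conversion drop; the visitor and order counts are illustrative assumptions, not figures from this test:

```python
from math import sqrt
from scipy.stats import norm

# Illustrative counts - substitute the real figures from your test tool.
visitors_a, orders_a = 50_000, 1_500   # control: 3.00% conversion
visitors_b, orders_b = 50_000, 1_406   # test recipe: ~2.81% (a 6.3% relative drop)

p_a = orders_a / visitors_a
p_b = orders_b / visitors_b
p_pooled = (orders_a + orders_b) / (visitors_a + visitors_b)

# Standard error under the null hypothesis that both recipes convert equally.
se = sqrt(p_pooled * (1 - p_pooled) * (1 / visitors_a + 1 / visitors_b))
z = (p_b - p_a) / se
p_value = 2 * norm.cdf(-abs(z))  # two-tailed

print(f"Relative change: {(p_b - p_a) / p_a:+.1%}, z = {z:.2f}, p = {p_value:.3f}")
```

With these made-up numbers the 6.3% drop isn't quite significant at the usual 5% level (p comes out around 0.08), so at this volume the complaint would have some merit - run the numbers rather than argue about them.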

They suggest that the whole A/B testing methodology is inaccurate.

* A/B testing is outdated and unreliable.  
* The split between the two groups wasn't 50-50:  there are 2.2% more visitors in A than B (an imbalance that can be checked directly, as sketched below).
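The 50-50 complaint can also be settled directly: a sample ratio mismatch (SRM) check asks how likely a 2.2% imbalance would be under a genuinely random 50-50 split.  A sketch, again with assumed visitor counts:

```python
from scipy.stats import chisquare

# Assumed visitor counts per arm - A has ~2.2% more visitors than B.
visitors = [50_550, 49_450]

# Chi-square goodness-of-fit test against the intended 50-50 split.
stat, p_value = chisquare(visitors)
print(f"chi2 = {stat:.2f}, p = {p_value:.4f}")
```

Counter-intuitively, the verdict depends on volume: at a few thousand visitors per arm a 2.2% gap is well within random variation, but at 50,000 per arm (as assumed here) the p-value comes out tiny and the imbalance itself would deserve investigation before anyone trusts the results - in either direction.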

Maybe they'll comment that the data wasn't analysed or segmented correctly, and they make some points about this:

* The test data includes users buying other items with their sparkplugs.  These should be filtered out.
* The test data must have included users who didn't see the test experience.
* The data shows that users who browsed on mobile phones performed at only -5.8% on conversion, so they're doing better than desktop users.

Remember:  none of this is personal.  You are, despite your best efforts, criticising a project that they've spent weeks or even months polishing and producing.  Nobody until this point has criticised their work; in fact, everybody has said how good it is.  It's not your fault: your job is to present the data and to provide insights based on it.  As a testing professional, your job is to run and analyse tests, not to be swayed into showing the data in a particular way.

They ran the test at the request of the Global Integration Leader, and burnt three weeks waiting for it to complete.  The deadline for implementing the new sparkplug selector is Tuesday, and they can't stop the whole IT roadmap (which is dependent on this first deployment) just because one test showed some negative data.  They would have preferred not to test it at all, but it remains your responsibility to share the test data with other stakeholders in the business - the marketing and merchandising teams, who have a vested interest in the site's financial performance.  It's not easy, but it's still part of your role to present the unbiased, impartial data that makes up your test analysis, along with data-driven recommendations for improvements.

It's not your responsibility to make the go/no-go decision, but it is up to you to ensure that the relevant stakeholders and decision-makers have the full data set in front of them when they make the decision.  They may choose to implement the new feature anyway, taking into account that it will need to be fixed with follow-up changes and tweaks once it's gone live.  It's a healthy compromise, providing that they can pull two developers and a designer away from the next item on their roadmap to do retrospective fixes on the new selector.  
Alternatively, they may postpone the deployment and use your test data to address the conversion drops that you've shared.  How are the conversion drop and the engagement data connected?  Is the selector providing valid and accurate recommendations to users?  Does the data show that they enter their car colour and their driving style, but then go to the search function when they reach a question about their engine size?  Is the sequence of questions optimal?  Make sure that you can present these kinds of recommendations - it shows the value of testing, as your stakeholders would not be able to identify these insights from an immediate implementation.

So - why not just switch it on?  Here are four good reasons to share with your stakeholders:

* Test data will give you a comparison of whole-site behaviour - not just 'how many people engaged with the new feature?' but also 'what happens to those people who clicked?' and 'how do they compare with users who don't have the feature?'
* Testing will also tell you about the financial impact of the new feature (good for return-on-investment calculations, which are tricky when seasonality and other factors have to be considered)
*  Testing has the key benefit that you can switch it off - at short notice, and at any time.  If the data shows that the test recipe is losing serious money, you can identify this and, after a discussion with key stakeholders, pull the plug within minutes.  You don't have to wait until the next IT deployment window to undeploy the new feature.
* Testing will give you useful data quickly - within days you'll see how it's performing; within weeks you'll have a clear picture.




Thursday, 20 October 2016

English Premier League: Which Season Ticket is the Best Value?

In my two previous posts, I've examined the data for the English Premier League for the last ten seasons, reviewing how 'exciting' each season has been.  I've drawn some conclusions, segmented the data and found some interesting data points, but not yet produced anything that's really useful, or that can help a football fan.

It's time to move on, and to provide some facts and figures that are more meaningful and more useful than anything I've written previously - in particular, to look at the relative value and cost of season tickets for each of the teams.  But first, a quick recap:

Post number 1: Less than 10% of English Premier League games are goal-less (0-0) draws.
Post number 2:  Arsenal consistently achieve more goals per game (scored plus conceded) than average, while Everton frequently have fewer goals per game than average.

All very interesting and quotable, but not really anything you can act on.  So far, the best recommendation I could make is: "If you were given the choice between watching an Arsenal game or an Everton game, I'd recommend the Arsenal game."

What I propose to do next is to start connecting the data I have to some additional data that will help form recommendations - in this case (and in most cases in business), money.  Money, in the form of reduced costs or increased sales and revenue, is often the essential part of any business recommendation, and I can apply the same process here.  We know how many goals per game (on average) we will see for each team in the English Premier League, but what we haven't yet identified is how much it would cost to see each game, and how much it will cost per goal.

In order to calculate this, I've taken the data from 2015-16 (the most recent completed season) and looked at the costs of season tickets, using the Sky Sports website for the costs.  I'm using the cheapest standard adult season ticket cost in each case.
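The "cost per goal" figures later in this post are consistent with dividing the season ticket price by the club's total goals per game across all 38 league fixtures.  Here's a minimal sketch of that calculation - the £299 ticket price is an illustrative assumption, not a verified figure:

```python
# "Cost per goal" for a season ticket: the cheapest adult season ticket
# price divided by total goals per game across all 38 league fixtures.
# The ticket price is an assumption for illustration.
GAMES_PER_SEASON = 38

ticket_price = 299       # assumed cheapest adult season ticket (GBP)
goals_per_game = 2.95    # Man City, 2015-16

cost_per_goal = ticket_price / (goals_per_game * GAMES_PER_SEASON)
print(f"£{cost_per_goal:.2f} per goal")  # -> £2.67, matching the figure below
```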

Jumping straight into the analysis - let's compare the cost of a season ticket to the average number of goals per game for the 2015-16 season:



And then compare the season tickets on a "cost per goal" basis, again for the 2015-16 season:



Isn't it interesting how the data has become more relevant, meaningful and even actionable when you start introducing money?

Arsenal may usually have the largest number of goals per season (or per game), and consistently achieve over-average performance there, but if you want to watch 'exciting' football of their type, you're going to have to pay for it.  (Note that the 2015-16 season was lower than usual for Arsenal, who actually came in below average for goals per game).

If you want the best value for your season ticket, then Man City is the place to go, at just £2.67 per goal - and you'll see plenty of goals too.

This data could be displayed geographically (are London clubs better value than other regions?) or sorted in various other ways.  Beware, though, while you do this, of introducing apparent trends into your data when there are none:




This one isn't too bad, although it does look like season tickets are coming down in price.




This second one, though, makes it appear that (1) there is a trend, and (2) season ticket prices are going up (which is generally the case).
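The 'trend' in a chart like this is often an artefact of plotting categorical data (clubs) in sorted order.  A quick sketch of the effect, using made-up prices for hypothetical clubs:

```python
import matplotlib.pyplot as plt

# Made-up season ticket prices for five hypothetical clubs.
clubs = ["Club A", "Club B", "Club C", "Club D", "Club E"]
prices = [499, 720, 365, 614, 550]

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))

# Left: alphabetical order - no apparent pattern, which is honest,
# because club names carry no numeric meaning.
ax1.plot(clubs, prices, marker="o")
ax1.set_title("Alphabetical order")

# Right: the same data sorted by price - the line now slopes upwards,
# suggesting a 'trend' that is purely an artefact of the sort order.
prices_sorted, clubs_sorted = zip(*sorted(zip(prices, clubs)))
ax2.plot(clubs_sorted, prices_sorted, marker="o")
ax2.set_title("Sorted by price - a 'trend' appears")

plt.tight_layout()
plt.show()
```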

In Summary

In this series, I've moved from data to analysis to insight:

Post number 1: Less than 10% of English Premier League games are goal-less (0-0) draws. Data, and analysis

Post number 2:  Arsenal consistently achieve more goals per game (scored plus conceded) than average, while Everton frequently have fewer goals per game than average.  Analysis, but still nothing actionable.


Post number 3 (this post):  Arsenal may have the most goals on average, but in 2015-16 the cost of seeing a goal there (£10.25) was much higher than at the other clubs: 20% higher than the next-highest (Southampton, £8.53) and nearly four times the cost of seeing a goal at Man City (£2.67 - actually 3.84 times as much).

Recommendations:
If you have the choice of watching an Everton match or an Arsenal match as a neutral, pick the Arsenal match.

Buy a season ticket for Man City, Villa, or West Brom.  If you want to follow a London club, the best value season ticket in London was Chelsea at £4.77 per goal - less than half Arsenal's cost per goal.  Actionable analysis.


Review
In a future post, I'll look at this worked example, pulling apart the differences between data, analysis, actionable analysis and insight.


Wednesday, 12 October 2016

How exciting is the English Premier League?

So, it's the start of the English Premier League (EPL) season. Sport generates vast amounts of data, all available for analysis and insight, and in this post (and probably a couple of following posts), I will be looking at the English Premier League (football, aka soccer) for recent years and reviewing how the game has changed. This will form a practical look at data, reporting, analysis, insight and actionable analysis.

This is a reconstructed post: I originally posted this in September but the post has since been deleted or lost.  Here's what I can remember of it.


There are a number of questions to be asked (and answered):


How 'exciting' is the English Premier League?

How many goals can you expect to see per game?
How many games end in goal-less draws?
How many games are won by a one-goal margin (perhaps a good definition of a tense, exciting game)?

This data can then be used to compare the English Premier League with other leagues (in the UK and abroad).

So, to start with, what's the average number of goals per game (total scored by both teams) for each of the last eleven seasons?

And the answer is:

And how does this compare with the percentage of games that are dull, uninteresting, goal-less draws?


The line graph above shows the percentage of goal-less draws.  It doesn't exactly trend with the average number of goals per game, but when the percentage of goalless draws is high (2008-2009) then the average goals per game is low (less than 2.5).

This does lead to an interesting point that would make marketers and headline-writers happy: "Less than 10% of EPL games end in goalless draws" (excluding 2008-2009).
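These figures are straightforward to reproduce.  A minimal sketch, assuming one season's results in a CSV with full-time home and away goals in columns named FTHG and FTAG (the layout used by the football data website linked below) - the filename is a placeholder:

```python
import pandas as pd

# Placeholder filename: one season of results, with FTHG/FTAG holding
# full-time home and away goals.
results = pd.read_csv("epl_season.csv")

total_goals = results["FTHG"] + results["FTAG"]

avg_goals_per_game = total_goals.mean()
pct_goalless = (total_goals == 0).mean() * 100

print(f"Average goals per game: {avg_goals_per_game:.2f}")
print(f"Goalless draws: {pct_goalless:.1f}% of games")
```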

Now we can see that 2006-07 had the lowest average number of goals per game, while 2011-12 had the highest; we can then analyse these two seasons side by side - see below - to understand where the differences were.

Key points:
- 2006-07 had 34 goalless draws, compared with 27 in 2011-12.  Only 2008-09 had fewer (25).
- 2011-12 had more games with five, six, seven, eight and ten goals.
- The highest-scoring game in 2006-07 was Arsenal 6 - Blackburn 2.
- In 2011-12, the highest-scoring game was Man United 8 - Arsenal 2.

Finally, which seasons were most interesting from the perspective of one-goal winners?  Not just 1-0, but 2-1, 3-2, 4-3 and so on.   
2011-12, with its huge average number of goals per game, doesn't do so well here.  2006-07 and 2008-09, the two seasons with low goals per game and a high percentage of goalless draws, do marginally better - they were both really mean seasons.

Football data obtained from this football website; others are available.

--

Summary

Analysing the data at this level - with trended comparisons - has given us the ability to compare one time period with another.  There's nothing actionable here, but we get a nice headline about the percentage of 0-0 draws.  In the next post I wrote (chronologically, before the original version of this post was lost), I segmented the data by team, and that provided more interesting insights.

Other articles I've written looking at data and football

Checkout Conversion:  A Penalty Shootout
When should you switch off an A/B test?
The Importance of Being Earnest with your KPIs
Should Chelsea sack Jose Mourinho? (It was a relevant question at the time, and I looked at what the data said)
How Exciting is the English Premier League?  (what does the data say about goals per game?)

Friday, 23 September 2016

Premier League Excitement - Further Analysis

In my last post I looked at 'How exciting is the Premier League' and produced the interesting data point that less than 10% of Premier League games are goal-less.  This may be interesting, and it might even count as insight, but it's not very actionable.  We can't do anything with it, or make any decisions from it.  I suppose the question is, "Is that a lot?" and I'll be looking at that question in more detail in future.

So, my next step is to look at how the different teams in the Premier League compare on some of the key metrics that I discussed - goals per game (total conceded plus scored), percentage of goalless games and so on.

Number of goals per game (conceded plus scored)

Firstly, I segmented the data per team:  how many goals were there per game for each team in the Premier League?  This is time-consuming, but worthwhile, and a sample of the data is shown below.  I have data as far back as the 2004-5 season, but the full width wouldn't fit on this page:
Club             Y2010   Y2011   Y2012   Y2013   Y2014   Y2015   Y2016
Arsenal           2.58   3.03*   3.24*    2.87    2.87    2.82    2.66
Aston Villa       2.21    2.82    2.37   3.05*    2.63    2.32    2.71
Birmingham           -    2.50       -       -       -       -       -
Blackburn         2.79    2.76   3.32*       -       -       -       -
Bolton            2.61    2.84   3.24*       -       -       -       -
Charlton          2.47       -       -       -       -       -       -
Chelsea           2.32    2.68    2.92    3.00    2.58    2.76    2.95
Crystal Palace       -       -       -       -    2.13    2.58    2.37
Everton           2.32    2.53    2.37    2.50    2.63    2.58    3.00
Fulham            2.58    2.42    2.61    2.89   3.29*       -       -
Liverpool         2.21    2.71    2.29    3.00   3.97*    2.63    2.97
Man City          1.92    2.45   3.21*    2.63   3.66*   3.18*    2.95
Man United        2.89   3.03*   3.21*   3.39*    2.82    2.61    2.21
Middlesbrough     2.45       -       -       -       -       -       -
Newcastle         2.24    2.97    2.82    2.97    2.68    2.71    2.87
Norwich              -       -   3.11*    2.61    2.37       -    2.79
Portsmouth        2.29       -       -       -       -       -       -
Southampton          -       -       -    2.87    2.63    2.29    2.63
Tottenham         2.92    2.66    2.82    2.95    2.79    2.92    2.74
West Brom            -   3.34*    2.55    2.89    2.68    2.34    2.16
Wigan             2.53    2.66    2.74   3.16*       -       -       -
Season Average    2.77    2.80    2.81    2.80    2.77    2.57    2.70

Dashes indicate a season where a team was not in the Premier League.
Figures marked * show where a team achieved over 3 goals per game for the season.
Y2008 indicates the season 2007-2008.
Firstly:  sorting alphabetically makes sense from a listing perspective, but for comparison the data is best sorted numerically (from highest to lowest). 

Secondly:  There's a lot of data here, and clearly a visualisation is needed:  I'm going with a line graph.  And to avoid spaghetti, I'm going to highlight some of the key teams - the team with the highest average number of goals per game; the team with the lowest, and the average.

Thirdly:  to identify the overall highest- and lowest-scoring teams, I'm going to take each team's average across the last 12 seasons and sort the list from highest to lowest.  Teams that were not in the Premier League for one or more seasons are included based on their performance while they were in the Premier League.
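In code, this is one line once the table is loaded: a row-wise mean that skips missing seasons.  A sketch, assuming the table above lives in a CSV with clubs as rows and one column per season (the filename is a placeholder):

```python
import pandas as pd

# Placeholder filename: the goals-per-game table, clubs as rows, one
# column per season, blank cells where a club wasn't in the league.
df = pd.read_csv("goals_per_game.csv", index_col="Club")

# Drop the league-average row, then take a row-wise mean; missing (NaN)
# seasons are skipped by default, so each club is rated only on the
# seasons it actually played.
club_averages = (
    df.drop(index="Season Average")
      .mean(axis=1)
      .sort_values(ascending=False)
)

print(club_averages.round(3))
```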

Premier League Teams:  Average number of goals per game over the last 12 seasons:

Club              Average
Arsenal             2.842
Tottenham           2.833
Man City            2.825
Blackburn           2.816
Man United          2.807
Liverpool           2.781
Newcastle           2.751
Norwich             2.717
Bolton              2.705
Overall Average     2.702
Birmingham          2.671
Chelsea             2.670
West Brom           2.669
Aston Villa         2.667
Fulham              2.613
Southampton         2.605
Wigan               2.566
Everton             2.518
Charlton            2.474
Middlesbrough       2.404
Portsmouth          2.368
Crystal Palace      2.360

Key takeaways:  
- Arsenal have had the most total goals per game over the last 12 seasons (2.842 goals per game)
- Everton have the lowest average number of goals per game for teams which have been present in all 12 seasons (2.518 goals per game).
- Put another way:  Arsenal fans have seen 1296 league goals in the last 12 seasons, compared to 1148 for Everton fans (148 fewer).


Theo Walcott, celebrating during Arsenal's win over Hull, Sept 2016  Image credit

Time for some graphs!

Firstly, average goals per game by season, for the last 12 seasons, for Arsenal, Everton, the league average, Liverpool (who achieved an average of 3.97 in 2013-14) and Man United (because they're always worth comparing).



This shows clearly that Arsenal (green line) have consistently exceeded the league average, falling below it only twice in the last 12 seasons.  Everton (blue) have only once exceeded the average, and that was in the most recent season.  Liverpool have exceeded the average over the last four seasons, but prior to that were consistently below it (and similar to Everton).

Connecting this to 'real life' events:

- Everton moving from David Moyes to Roberto Martinez in August 2013 did not make any difference to their 'excitement' factor until the 2015-16 season.

- Arsenal, and Arsene Wenger, could not be called 'boring' based on their goals per game. 

- Brendan Rodgers had an interesting time at Liverpool, when they hit the highest goals-per-game for a season of any club in the last 12 years (3.97).  Note that this does not discriminate between goals scored and goals conceded.

Secondly, I've adjusted the data to show the difference between each team and the overall average, so that the chart shows a delta versus the average.
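Using the same assumed CSV as the earlier sketch, this adjustment is a single subtraction:

```python
import pandas as pd

# Reusing the placeholder goals_per_game.csv from the earlier sketch.
df = pd.read_csv("goals_per_game.csv", index_col="Club")

# Subtract each season's league average (the 'Season Average' row) from
# every club's figure, so positive values mean 'more goals than average'.
season_avg = df.loc["Season Average"]
delta = df.drop(index="Season Average").sub(season_avg, axis=1)

print(delta.loc[["Arsenal", "Everton", "Liverpool"]].round(2))
```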



To give you an indication of Liverpool's remarkable 2013-14 season:  their games averaged more than one goal per game above the season average (3.97 against 2.77).  Brendan Rodgers had an eventful time at Liverpool.

Fulham also had an 'exciting' season in 2013-14, achieving 3.29 goals per game (the average was 2.77) - but were subsequently relegated.

In summary:

- Arsenal have had the highest average goals per game over the last 12 seasons (2.842 goals per game), while Everton have the lowest of the ever-present teams, at 2.518 goals per game.
- Arsenal have exceeded the league average goals per game in 10 out of the last 12 seasons, and have the highest average overall.
- Man United have achieved above-average goals per game in nine of the last 12 seasons; however the 2015-16 season was the least 'exciting' they've recorded in that period.

Review

Segmenting the data by team is proving more useful.  It's now possible to make predictions about the 2016-17 season:

- Arsenal to remain most 'exciting', closely followed by Tottenham and Man City.
- Everton to remain the least 'exciting', with 1-1, 2-1 and 2-0 results dominating.
- Man United are extremely unpredictable, especially as they have a new manager this season (although nobody could have predicted the dreadful start they've made to the current season).

The raw data used in this analysis is available from the football data website, among others.

More articles on data analysis in football:

Reviewing Manchester United's Performance
Should Chelsea Sack Jose Mourinho? (it was relevant at the time I wrote it)
How exciting is the English Premier League?  (quantifying a qualitative metric)
The Rollarama World Football Dice Game (a study in probability)