Every time we open a spreadsheet, or start tapping a calculator (yes, I still do), or plot a graph, we start analysing data. As analysts, it is probably most of what we do all day. It's not necessarily difficult - we just need to know which data points to analyse and which metrics to divide by which (do you count exit rate per page view, or per visit?), and then we churn out columns and columns of spreadsheet data. As online or website analysts, we plot the trends over time, or we compare pages A, B and C, and we write up the results (so we do some reporting at the end as well).
As business analysts, it's not even like we have complicated formulae for our metrics - we typically divide X by Y to give Z, expressed to two decimal places, or possibly as a percentage. We're not calculating acceleration due to gravity by measuring the period of a pendulum (although it can be done), with square roots, fractions, and square roots of fractions.
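To show the contrast, here's a quick sketch in Python of both kinds of calculation - the analyst's divide-X-by-Y metric next to the pendulum formula g = 4π²L/T². All the figures are invented for illustration:

```python
import math

# The typical analyst's metric: simple division, two decimal places
revenue = 12500.0   # hypothetical monthly revenue
visits = 48000      # hypothetical monthly visits
revenue_per_visit = round(revenue / visits, 2)

# Versus the physics version: g = 4 * pi^2 * L / T^2,
# from the period T of a pendulum of length L
length_m = 0.994    # pendulum length in metres (illustrative)
period_s = 2.0      # measured period in seconds (illustrative)
g = 4 * math.pi ** 2 * length_m / period_s ** 2

print(revenue_per_visit)  # 0.26
print(round(g, 2))        # 9.81
```

One line of division versus squares and constants - the point stands that the day-to-day metric maths is the easy part.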
Analysis - dare I say it - is easy.
What follows is the interpretation of the data, and this is a potential minefield, especially when you're presenting to stakeholders. If analysis is easy, interpretation can be genuinely difficult.
For example, let's suppose revenue per visit went up by 3.75% in the last month. This is almost certainly a good thing - unless it went up by 4% in the previous month, and by 5% in the same month last year. And what about the other metrics that we track? Revenue per visit may have gone up, but there are other metrics to consider as well. In fact, in the world of online analysis, we have so many metrics that it's scary - and so accurate interpretation becomes even more important.
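The "good in isolation, worrying in context" point can be sketched in a few lines of Python - the growth figures below are the invented ones from the example above:

```python
# Hypothetical revenue-per-visit growth figures (all numbers invented)
growth = {
    "same month last year": 5.00,  # % increase
    "previous month": 4.00,        # % increase
    "last month": 3.75,            # % increase
}

# In isolation, +3.75% looks healthy...
latest = growth["last month"]

# ...but compared against the earlier periods, growth is slowing down
decelerating = all(
    latest < value
    for period, value in growth.items()
    if period != "last month"
)
print(decelerating)  # True
```

The analysis (the division, the percentages) is trivial; whether a decelerating-but-positive trend is good news is the interpretation question.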
Okay, so the average-time-spent-on-page went up by 30 seconds (up from 50 seconds to 1 minute 20). Is that good? Is that a lot? Well, more people scrolled further down the page (is that a good thing - is it content consumption or is it people getting well and truly lost trying to find the 'Next page' button?) and the exit rate went down.
Are people going back and forth trying to find something you're unintentionally hiding? Or are they happily consuming your content and reading multiple pages of product blurb (or news articles, or whatever)? Are you facilitating multiple page consumption (page views per visit is up), or are you sending your website visitors on an online wild goose chase (page views per visit is up)? Whichever metrics you look at, there's almost always a negative and positive interpretation that you can introduce.
This comes back, in part, to the article I wrote last month - sometimes two KPIs is one too many. It's unlikely that everything on your site will improve during a test. If it does, pat yourself on the back, learn from it and make it even better! But sometimes - usually - there will be a slight tension between metrics that "improved" (revenue went up), metrics that "worsened" (bounce rate went up) and metrics that are open to anybody's interpretation (time on page; scroll rate; pages viewed per visit; usage of search; the list goes on). In these situations, the metrics which are open to interpretation need to be read together, so that they tell a consistent story from the perspective of the main KPIs. For example, if your overall revenue figures went down while time on page and scroll rate both went up, then you would propose a causal relationship between the page-level metrics and the revenue data: people had to search harder for the content, and many couldn't find it, so they gave up.
On the other hand, if your overall revenue figures went up while time on page and exit rate both increased (for example), then you would conclude that a smaller group of people were spending more time on the page, consuming content and then completing their purchase - so the increased time on page is a good thing, although the exit rate needs to be remedied in some way. The interpretation of the page-level data has to be in the light of the overall picture - or at least with reference to multiple data points.
I've discussed average time on page before. A note that I will have to expand on sometime: we can't track time on page for people who exit the page. It's just not possible with standard tags.
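A minimal sketch of why that is (page names and timestamps invented): a standard page tag timestamps each page view, and time on page is the gap to the next hit - so the exit page never gets a closing timestamp:

```python
# A visit as a sequence of (page, timestamp-in-seconds) hits,
# as a standard page-tag would record them (figures illustrative)
hits = [("home", 0), ("products", 40), ("detail", 95)]

# Time on page = gap to the *next* hit; the exit page never gets one,
# because no later hit arrives to close it off
time_on_page = {}
for (page, t), (_, t_next) in zip(hits, hits[1:]):
    time_on_page[page] = t_next - t

print(time_on_page)  # {'home': 40, 'products': 55} - no entry for 'detail'
```

So the visitor's final page - however long they actually spent reading it - contributes nothing to average time on page, which skews the metric in ways worth remembering when interpreting it.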
So: analysis is easy, but interpretation is hard and is open to subjective viewpoints. Our task as experienced, professional analysts is to make sure that our interpretation is in line with the analysis, and is as close to all the data points as possible, so that we tell the right story.