How Analysis Goes Wrong: The Week in Awful Analysis – Week #1
How Analysis Goes Wrong is a new weekly series focused on evaluating common forms of business analysis. All evaluation is done with one goal in mind: does the analysis present a solid case that spending resources in the recommended manner will generate more additional revenue than any other action the company could take with the same resources? The goal here is not to knock down analytics; it is to help highlight those who are unknowingly damaging the credibility of the rational use of data. All names and figures have been altered where appropriate to mask the “guilt”.
I thought it was time to start breaking down some of the awful analysis that I hear on a daily and weekly basis. In every case, I am choosing analysis that was presented with a straight face and that the person doing it believed was amazing or justified. There are many reasons someone might present an analysis: justifying their job, giving a boss what he wants, making someone happy, boredom, ignorance. I cannot speak to why someone thought this was a good idea; I can only evaluate their ability to make a meaningful business case.
This evaluation will not focus on those motives as the end objective, but will instead measure each analysis against a single objective standard: does it present a logical case that the resources, used this way, will generate more additional revenue than any other use of those same resources? This does not in any way say that the conclusion reached is wrong, simply that the data presented in no way confirms that conclusion.
To start, I want to walk through one of the most common types of analysis I hear from web analysts. There is obviously more to the larger conversation than what is shared with the analysis presented, but in all cases I am tackling the common traits of that conversation: what is and is not presented. This exact case was shared with me earlier this week as an example of the amazing power of analytics (though the numbers have been changed to protect the individual and organization involved):
Analysis: Checking your fallout report, we see that your registration page has an exit rate of 74%. The industry standard for such pages is 50%. This means we should attempt to improve that page; if it were to meet the industry standard, the change would be worth $3.4 million per year.
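For context, a dollar figure like that is usually produced by naive back-of-envelope arithmetic. Here is a minimal sketch of the typical calculation; every input (visitor count, per-registration value) is a made-up illustrative assumption, not a figure from the original analysis:

```python
# Hypothetical back-of-envelope math behind a "$3.4M per year" claim.
# Every number here is an illustrative assumption, not real data.

monthly_visitors = 100_000       # visitors reaching the registration page
current_exit_rate = 0.74         # observed exit rate
target_exit_rate = 0.50          # "industry standard" exit rate
value_per_registration = 11.81   # assumed revenue per completed registration

# Naive assumption: every visitor who no longer exits completes
# registration and spends like an average existing user.
extra_registrations = monthly_visitors * (current_exit_rate - target_exit_rate)
annual_lift = extra_registrations * value_per_registration * 12

print(f"Projected annual lift: ${annual_lift:,.0f}")  # roughly $3.4M
```

Notice what the arithmetic contains: no cost, no timeline, no error bars, and no evidence the exiting visitors would behave like the current ones.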
Problems: A large number of assumptions go into this type of analysis, so I want to hit them at a high level, one by one:
1) It assumes a linear rate of outcome for people in the flow: I have 50 people that produce $200, so if I add 50 more, I will get $400. This ignores that you have no idea about the behavior patterns of the people who weren’t spending money before and whom you are now magically getting to do so. They are not like your other users, yet you assume they will suddenly behave exactly the same way.
2) It ignores the single most fundamental rule of optimization: you need three pieces of information to know the value of a change. You need to know the population, your ability to influence it, and the cost. It does no good to talk about a real-world problem if achieving the goal would take ten times the resources.
3) It ignores the reality of finite resources. For the same cost of getting that lift for that population, what else could you do with those resources, and what measure have you given that this action is more valuable than any other?
4) It assumes the issue is on that page, and not on the prior one, or the one before that, or in the traffic coming to the site itself.
5) It assumes that the changes you make to improve performance will not negatively impact the existing purchasers on the site (it actually assumes no impact at all).
6) It ignores whether you even have access to the page, the resources to change it, and a political environment that will allow those changes. If the suggestion can’t be acted on, you have just wasted everyone’s time.
7) It ignores the time it would take to make that change. You cannot magically give a revenue number for a year without knowing whether it will take you nine months or two weeks to hit that magical mark.
8) It gives an absolute revenue figure. You cannot give a perfect projected revenue figure, no matter how much you believe in your predictive measures. There is always variance and an error rate, and you have to assume that nothing in the world related to the site will change during that magical period of time.
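To make point 1 concrete, here is a minimal sketch using the same 50-people/$200 numbers. The 30% figure for how much the marginal visitors actually spend is a pure assumption for illustration; the point is only that any value below 100% breaks the linear projection:

```python
# Point 1's linear extrapolation: 50 people produce $200, so 50 more
# "must" produce another $200. All numbers are illustrative assumptions.

current_visitors = 50
current_revenue = 200.0
revenue_per_visitor = current_revenue / current_visitors  # $4.00

added_visitors = 50

# Naive projection: the new visitors spend exactly like the existing ones.
naive = current_revenue + added_visitors * revenue_per_visitor

# More cautious view: these visitors were exiting for a reason, so assume
# (purely for illustration) they spend at 30% of the existing rate.
marginal_rate = 0.30
cautious = current_revenue + added_visitors * revenue_per_visitor * marginal_rate

print(naive, cautious)  # 400.0 260.0
```

The gap between the two projections is exactly the part of the "$3.4M" that the analysis never justified.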
I could keep going, but I will hit many other points in future breakdowns. Fundamentally, this analysis fails because it assumes that an anomaly or pattern in correlative data tells you anything about your ability to change things with the finite resources every group has to work with. Because of this, it focuses on a magically high dollar figure but does not account for cost, alternatives, time, or where the “problem” actually is.
If there is an analysis that you would like to have reviewed, privately or publicly, you can send an email directly to email@example.com.