How Analysis Goes Wrong: The Week in Awful Analysis – Week #7
How Analysis Goes Wrong is a new weekly series focused on evaluating common forms of business analysis. All evaluation of the analysis is done with one goal in mind: does the analysis present a solid case that spending resources in the manner recommended will generate more revenue than any other action the company could take with the same resources? The goal here is not to knock down analytics; it is to help highlight those who are unknowingly damaging the credibility of the rational use of data. What you don’t do is often more important than what you do choose to do. All names and figures have been altered where appropriate to mask the “guilt”.
Sometimes the worst mistakes are those we make when trying to impress others with our work. Nothing destroys credibility faster than making up numbers or using some form of flawed logic to make it look like our work is the only reason the organization exists. Often, these are the mistakes we are least aware of. You can get away with this type of reporting in the short term, but as soon as someone looks behind the curtain, groups are left without meaningful answers and often spend the rest of their time updating resumes instead of improving analysis. A perfect example of this comes from a very public source, and while I would normally not name the direct example, in this case it is such a well-known and “popular” speaker that there is little need to mask identities. This week’s awful analysis comes from none other than Avinash and his blog Occam’s Razor.
Analysis – To show how much impact analytics has had, all you need to do is take the current revenue minus the past revenue, multiply by the time period, and divide by the cost, and you get the ROI of analytics.
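Stated as code, the claim above amounts to the following sketch. All of the figures here are hypothetical, invented purely to show the arithmetic:

```python
# A sketch of the "analytics ROI" formula as stated above, with
# hypothetical figures. Note that it credits 100% of the revenue
# change to analytics.

def naive_analytics_roi(current_revenue, past_revenue, periods, cost):
    """(current - past) * time / cost, exactly as the formula claims."""
    return (current_revenue - past_revenue) * periods / cost

# Hypothetical: revenue grew from $1.0M to $1.2M over 12 months,
# and the analytics program cost $300K in that window.
roi = naive_analytics_roi(1_200_000, 1_000_000, 12, 300_000)
print(roi)  # 8.0
```

Seeing the formula laid out this plainly makes the problems easier to spot, starting with the fact that every dollar of change flows straight into the numerator no matter what caused it.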
It is often a good thing for people in analytics to understand how marketing can take data and exploit it for personal gain, but it is something very different when we fall into the exact same traps. There is so much wrong here that I hardly know where to begin, but let’s look at the highest-level problems:
1) It attributes 100% of the revenue gain to analytics, which ignores the fact that data is sinusoidal, meaning it goes up and down all the time with no direct interaction. The complexities of the entire marketplace make a simple pre/post analysis nearly useless for attributing impact with any kind of accuracy.
2) It assumes that the same revenue and resources would not have been spent elsewhere. The people in these departments are not stupid, and even if they followed 100% of your suggestions, it does not mean they would not have gotten more by doing what they would have done otherwise. If they would have generated a 150% increase on their own, and your suggestions generated a 120% increase, you actually lost 30% of revenue; you did not gain 120%. The same can be said for looking at a test only for what won, and not at the difference between what won and what would have won if the test had been limited to the original idea. In both cases, the value of the analysis is the expansion of opportunities, not the original opportunity itself.
3) It focuses on the suggestions, not on the accuracy of the actual work. Some of the worst abusers of data are analysts, as they have the most opportunity to twist it to push their agenda over someone else’s. Nothing is worse for the industry as a whole than analysts who do not understand the limitations of their own opinions and the need for rational uses of data, rather than stories and meaningless suggestions.
4) The entire point is someone trying to find revenue opportunities from only passive interaction with data. There is no way to know the influence or cost of an action from passive (correlative) data, so why would you take any credit for additional revenue? This is the classic analyst fallacy of pretending that the case for action and the action to take are both available in the same data. Active (causal) data acquisition is the only way to get those pieces of information, yet here a claim is being made that requires that information without having it. There is nothing worse for the rational use of data than this obvious an abuse of personal agenda by the analytics team itself.
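The opportunity-cost argument in point 2 can be made concrete with a short sketch. The revenue figures are hypothetical, chosen to mirror the 150%/120% example above:

```python
# Hypothetical revenue outcomes mirroring the percentages in point 2.
baseline = 1_000_000        # revenue before either plan
with_analytics = 1_200_000  # outcome after following the analytics suggestions
counterfactual = 1_500_000  # outcome had the team executed its own plan instead

# The naive report claims the full gain over baseline as analytics impact.
claimed_gain = with_analytics - baseline       # +200,000

# Measured against the counterfactual, the suggestions actually cost revenue.
true_impact = with_analytics - counterfactual  # -300,000

print(claimed_gain, true_impact)  # 200000 -300000
```

The whole trick of the awful analysis is comparing against the baseline instead of the counterfactual; the counterfactual is exactly the number that passive data can never give you.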
I have seen this type of analysis done in many different ways over the years. In every case, it is a clear sign that the person doing the analysis not only does not understand what the data they report means, but also is using their position for personal gain rather than organizational gain. I believe that data can add rationality and improve the performance of organizations by orders of magnitude, but the only way to do that is for analysts themselves to use their own data rationally. It is important that people understand your impact on the bottom line, but there are many ways to do that which do not require false statements and personal agendas.
Analytics can be a powerful tool that shapes organizations, but it can also be a weapon used to push one person’s agenda over another’s, with the result being no gain to the organization, just more internal politics. We hear so much talk about the power of data and the potential of big data, and while you can use it to predict things and build tools that leverage it, the reality is that until we have people running programs who are interested in the real impact on the business, all the “promise” will be nothing but empty air. Just because you have analytics, or build a recommendation or a tool that uses data, does not inherently mean it is providing value to anyone. Value is additional revenue growth achieved in the most efficient way possible; it is not some flashy toy or your own personal agenda. If you want to really see data use expand throughout your organization, then stop abusing it yourself and help others see the real power of being able to explore and exploit information.