These days it seems that every marketer is trying to become more data- and analytics-driven. The promise of digital has always been getting real insight into the true impact of our marketing investments on sales. But for most of us, that promise has been just that -- a promise. The notion is appealing, but it's often been more of a future hope than a current objective.
The problem is that most of us are going about it wrong. Here are the top five mistakes people make as they try to take an analytics-based approach to marketing.
Underinvesting in infrastructure
Most companies are underinvested in marketing analytics infrastructure. OK, that's not necessarily what a frugal marketer wants to hear. But think about it. You have tens of thousands -- perhaps millions -- of customers. That's potentially hundreds of data points per customer -- across time. And customer information is just the tip of the iceberg. You also need to know about the people you touched but didn't convert, so you can tell whether what you did actually had an impact.
You can't analyze tens of millions of events in Excel. And, by the same token, you can't derive the true value of all this data from a software-as-a-service (SaaS) tool that took two hours to implement. It's millions -- possibly billions -- of marketing events! When someone contends that you can unlock the value of all this information quickly with an off-the-shelf tool, does that sound credible?
The sooner that we all accept that investing appropriately in analytics infrastructure is critical, the sooner we can actually know what works. Only then will we have real guidance on what to do instead of basing our budgets -- and careers -- on hunches, confirmation bias, and the like.
Underestimating data integration
We all recognize that centrally integrating and managing data sets from different marketing platforms is desirable. But saying it and doing it are two vastly different things.
Different platforms collect different data points, and in different structures. You don't snap your fingers and get such data sets combined correctly and accurately. Also, virtually every data set has errors and formatting issues that need to be identified and addressed.
What makes this such a bear of a challenge is that the quantity of data we are collecting is staggering. Brands have increased their capacity to collect data far more rapidly than their ability to ensure that all the data are relevant and usable. Sifting through larger and larger quantities of imperfect data can actually make your conclusions more inaccurate. A little good data is a lot better than a whole lot of bad data.
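To make the integration problem concrete, here is a minimal sketch (the platforms, column names, and values are all hypothetical) of the normalization work that merging even two tiny exports requires: renaming fields to a common schema, parsing inconsistently formatted timestamps, and flagging rows too malformed to use.

```python
import pandas as pd

# Hypothetical exports from two platforms, with different field names and formats.
email = pd.DataFrame({
    "UserId": ["u1", "u2", "u2", "u3"],
    "SentAt": ["2014-01-05", "2014-01-06", "not a date", "2014-01-07"],
})
display = pd.DataFrame({
    "user": ["u1", "u3", "u4"],
    "impression_ts": ["2014-01-04", "2014-01-06", "2014-01-08"],
})

# Map each export onto a common schema before combining.
email = email.rename(columns={"UserId": "user_id", "SentAt": "ts"})
email["ts"] = pd.to_datetime(email["ts"], errors="coerce")  # unparseable -> NaT
email["channel"] = "email"

display = display.rename(columns={"user": "user_id", "impression_ts": "ts"})
display["ts"] = pd.to_datetime(display["ts"])
display["channel"] = "display"

events = pd.concat([email, display], ignore_index=True)
bad_rows = int(events["ts"].isna().sum())   # malformed timestamps to investigate
events = events.dropna(subset=["ts"])       # or repair them, if the source allows
```

Even this toy example surfaces a bad timestamp and a user who appears in only one system; multiplied across millions of events and a dozen platforms, that triage work is exactly why integration can't be waved into existence.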
Ignoring the complexity
Marketing attribution is a major mathematical challenge, requiring the specialized skill sets of statisticians. When you unite all of your marketing data, the sheer quantity of information is overwhelming. It's not a job you can hand to a repurposed marketing generalist or finance person and expect effective results.
Solving the issue by simply making the data "smaller" -- by only analyzing data attached to conversions -- is not an option. You need to, at a minimum, compare converters and non-converters in order to start understanding what worked and what didn't. Just because exposure to a particular marketing activity is common among conversions doesn't mean it's disproportionately associated with conversion. You can't even measure the correlation of that exposure to conversion without looking at non-conversions.
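A toy calculation (with invented counts) shows why converter-only analysis is a dead end: exposure can look dominant among converters while being exactly as common among everyone else.

```python
# Hypothetical counts: 70 of 100 converters saw the ad -- but so did
# 700 of 1,000 non-converters. Looking at converters alone is misleading.
exposed_converters, converters = 70, 100
exposed_non_converters, non_converters = 700, 1_000

rate_among_converters = exposed_converters / converters              # 0.70
rate_among_non_converters = exposed_non_converters / non_converters  # 0.70

# A lift of 1.0 means exposure is no more common among converters than
# among non-converters -- the "70% of buyers saw our ad!" stat is empty.
lift = rate_among_converters / rate_among_non_converters
```

Without the non-converter denominator, that lift is simply incomputable, which is the point of the paragraph above.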
For virtually every brand, non-converters vastly outnumber converters, which multiplies the data management and analysis challenge by a factor of 10, 100, or even 1,000, depending on your conversion rate. Once you start to evaluate both converters and non-converters, data can get out of hand pretty quickly. Thus, we wind up back at the need to invest appropriately and assign the right people to the right tasks.
Making the jump to causation and incremental value is even more precarious because we must be careful to avoid the post hoc ergo propter hoc fallacy -- which is Latin for "after this, therefore because of this." Understanding what causes something -- like a conversion -- is different from understanding what events correlated with it. It might be that exposure to a particular tactic is "chasing" the conversion, rather than causing it. The principles of causal inference are nuanced, and need to be handled with expertise.
This is not just "big data." It's big and complex data with equally complex questions attached. And that means you need big math and stats to deal with it.
You need actionability
Most marketing organizations are increasing the amount they invest in analysis and analytics. But the majority of that work is focused on the past: Did we achieve what we said we would? Knowing "what worked" is important, but to derive the true value of analytics, you need actionability. You need your learning connected to the marketing execution tools you use.
Additionally, if we accept that multiple touch points influence conversion, then the challenge of connecting those tactics to true measures of their ROI impact becomes even more daunting. There are millions of touch points, hundreds of thousands of distinct paths to conversion, millions more non-conversions, dozens of creative executions, and so on. You don't push a button and get to the bottom of that sort of mathematical challenge. Software alone cannot do the work of experts.
Similarly, it challenges credulity to think that a pretty dashboard can deliver true analytics. The biggest issue with SaaS-based DIY solutions is that almost by definition they must focus on surrogate measures and incomplete math to make themselves easy to implement and use. Data visualization and clarity are possible -- actually, essential. But the unique data and situation of your brand can get lost when you use cookie-cutter platforms.
There are many great analysts in the world who aren't PhDs in statistics or mathematics. But analysis is different from analytics. And revealing the true business impact of marketing tactics -- back to correlation versus causality -- requires analytics, not reporting or analysis. I'm not being pedantic here. The words simply describe vastly different things.
I'm not anti-SaaS or anti-dashboard. A simple, actionable message is certainly important to getting things done "in the real world." That's something that is lost in the academic realm of statistics and math. But you don't provide genuine analytics with a dashboard, and sometimes complexity is required to get an accurate answer to a problem.
The key issue here is that understanding the impact of a tactic on ROI is difficult, especially for so-called branding campaigns. Thus, most brands substitute ROI impact analytics with surrogate measures -- clicks, Facebook "likes," Twitter follows -- to simplify the math. The value of such surrogate measures is, in most cases, unknown.
Has your brand figured out the precise business value of getting another Twitter follower? Or the value of a site visit for a product sold offline? Actionable analytics are focused on your true business goals, not surrogate measures whose value as business drivers is assumed rather than known.
Organizations lack political will
Something happens on the way to an analytics-based brand strategy. It's very tough to achieve, so we focus on delivering easy-to-measure metrics. Further, in many organizations, the focus drifts to verifying the relevance and validity of our current strategies, rather than identifying the best approaches for growth.
The challenge of analytics-driven marketing is that it requires commitment and alignment across an organization. There can be no sacred cows, no pet tactics. Or at least if there are sacred cows, we need to recognize that that's exactly what they are -- that there are reasons other than ROI maximization why we do them.
For some people reading this, a fact-based approach to tactic selection would seem a breath of fresh air. So many marketers I have spoken with have said that their efforts to optimize marketing are hindered by fad-driven decision making higher up. "We need a widget!" "We need an app!" "Why doesn't our toilet paper have a Twitter account?"
As a statistician, I am a purist. If I had my druthers, everything would be quantifiable and measurable. Not everything is quantifiable yet, but an organization aligned to the value of analytics will get closer and closer to that goal over time. An analytics-based approach to marketing strategy and allocation demands that we recognize that better information has given us new insight. Knowing and acting on that knowledge is a good thing. Knowing and not acting is irresponsible.
A pure analytical approach might suggest that some tactics are ineffective and need to be reconsidered, but this in no way questions the worth of individuals charged with implementing those tactics. They might, for example, have been incredibly adept at getting extraordinary results from a tactic even if it wasn't the best use of funds. For that, the individual deserves praise, not condemnation.
Most marketers say that their No. 1 goal is to figure out the exact best way to spend their budgets for maximum ROI. To do that, you need to:
- Stay focused on the ROI goal, rather than relying on possibly unconnected surrogate measures to assess performance.
- Commit to identifying the causes of conversion (versus correlations with conversion) so you can allocate your resources optimally.
- Invest in the people and tools to produce statistically sound insights.
- Act on those insights to the greatest extent possible.
Many non-statisticians assume that the mathematical techniques necessary for genuine marketing attribution haven't yet been invented. That isn't true. The techniques are decades old. Rather, most brands simply lack the commitment to invest in getting the answers -- and the political will to implement them.
On Twitter? Follow iMedia Connection at @iMediaTweet.