Your measurement department has almost certainly discovered the current religion in digital measurement: Don't overload on numbers. The key to successful dashboarding and reporting is finding a small set of sitewide KPIs that are understandable and immediately actionable.
Chances are, that's exactly what your enterprise has adopted -- a small set of key metrics like site conversion rate, visits trend, site satisfaction, etc. -- all laid out in big numbers with great fonts, pretty colors, big trend arrows, and lots of Tufte-inspired whitespace.
Unfortunately, these reports deliver neither understanding nor actionability.
Suppose the online marketing director walks into your office to tell you that your site conversion rate is up 5 percent. You'll probably be delighted. Next comes your SEO director to tell you that your natural search traffic is down 20 percent. That's bad, right?
But would you realize that in all probability the two stories are related? As you drive less early-stage traffic to your site via natural search, your conversion rate will go up: early-stage searchers are researching, not buying, so removing them from the mix raises the average conversion rate even as total orders fall.
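The mechanism is simple mix-shift arithmetic. The traffic volumes and per-segment conversion rates below are hypothetical, chosen only to show how losing low-converting search traffic pushes the blended rate up while orders decline:

```python
# Hypothetical traffic mix: early-stage natural-search visits convert
# far worse than the rest of the site's traffic.
search_visits, other_visits = 4_000, 6_000
search_cr, other_cr = 0.005, 0.03  # assumed per-segment conversion rates

orders_before = search_visits * search_cr + other_visits * other_cr  # 200 orders
cr_before = orders_before / (search_visits + other_visits)           # 2.0%

# Natural search traffic drops 20 percent; nothing else changes.
search_after = search_visits * 0.8
orders_after = search_after * search_cr + other_visits * other_cr    # 196 orders
cr_after = orders_after / (search_after + other_visits)              # ~2.13%
```

The blended conversion rate rises about 6 percent even though the site took fewer orders. Neither number alone tells you that.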
Reports built on KPIs are like having 10 different managers telling you 10 different, entirely unrelated stories and leaving you to figure out what they mean when you put them together. That might be life, but it doesn't have to be reporting.
Understanding the system -- the interrelationships between its parts -- is fundamentally different from, and more important than, understanding the state of any single metric.
It turns out that a change in any single variable in a complex system can always be explained in a variety of ways -- some of which would be interpreted as positive and some as negative. It doesn't matter if the metric is site satisfaction, revenue, conversion rate, or traffic. The possibility of multiple explanations makes it impossible to extract either meaning or actionability from single KPIs.
Not convinced? How could it be bad that your Net Promoter score is up? Here's a common driver of Net Promoter scores: in general, regular customers score higher than new customers. If your business is losing non-core customers as your product set ages, your Net Promoter score can increase. As business declines and the audience narrows, you'll increasingly serve hardcore advocates. It's like talking to people who still use Myspace: if they still use it, they're probably promoters.
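Since NPS is defined as the percentage of promoters minus the percentage of detractors, a shrinking customer base that sheds its detractor-heavy casual customers will mechanically lift the score. The customer counts below are invented to illustrate that:

```python
def nps(promoters, passives, detractors):
    """Net Promoter Score: % promoters minus % detractors."""
    total = promoters + passives + detractors
    return 100.0 * (promoters - detractors) / total

# Hypothetical: 1,000 customers before the product set ages.
score_before = nps(promoters=400, passives=300, detractors=300)  # NPS = 10

# Casual, detractor-heavy customers churn; 700 mostly-loyal customers remain.
score_after = nps(promoters=350, passives=200, detractors=150)   # NPS ~ 28.6
```

The score nearly triples while the customer base shrinks by 30 percent -- a "good" KPI reading produced by a bad business trend.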
What about revenue? Surely, it's impossible for a revenue increase to be bad! Not only is it possible, it's common. Company A creates a remarketing program that sends a 10 percent discount offer to everyone who abandons a cart. Unfortunately, consumers quickly game the system, abandoning carts deliberately to get the coupon. Gross revenue increases, but net revenue declines due to reduced margins. Short-term revenue gains that sacrifice margin are frequent. Poor reporting systems make this type of system interdependency difficult or impossible to spot or understand.
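A back-of-the-envelope version of the coupon trap makes the divergence concrete. All the order counts and the split between genuinely incremental buyers and coupon-gamers below are assumptions for illustration:

```python
LIST_PRICE = 100.0
baseline_orders = 1_000
gross_before = baseline_orders * LIST_PRICE   # $100,000 at list price
net_before = gross_before                     # no discounts yet

# After the remarketing program (hypothetical split): 30 truly incremental
# orders, but 700 existing buyers learn to abandon the cart for the coupon.
incremental_orders = 30
couponed_orders = 700
orders_after = baseline_orders + incremental_orders
gross_after = orders_after * LIST_PRICE               # $103,000 -- gross is up
discounts = couponed_orders * LIST_PRICE * 0.10       # $7,000 given away
net_after = gross_after - discounts                   # $96,000 -- net is down
```

A report showing only gross revenue declares the program a success; the margin line tells the opposite story.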
Still not convinced? What about net revenue? If net revenue is increasing, things have to be getting better! Not so. AOL in its ISP days provides a classic example. AOL was infamously difficult to cancel once you had signed up. The reason? Analysts at AOL had carefully measured the impact of difficult cancellation and could prove that making it hard to cancel improved net revenue. Sadly, they didn't measure the resulting impact on brand and customer satisfaction. The brand eroded under a relentless program of short-term net revenue optimization.
Give up? We hope so, because there just is no single metric that can be meaningfully interpreted when viewed in isolation. KPIs are not actionable because KPIs aren't individually meaningful.