Reading most web analytics reports, I am often reminded of Samuel Taylor Coleridge's epic, "The Rime of the Ancient Mariner." Instead of "water, water everywhere, nor any drop to drink," I think of "data, data everywhere, and not a thought to think." Far too often I discover that instead of a web analytics report, I am actually holding a web data report. The problem is simple: there is a difference between web reporting and web analytics.
The web really is the most measurable medium. We are lucky to work in an industry that offers a wealth of information to consider. Unfortunately, as any chocolate addict will tell you, too much of a good thing quickly turns bad. Data overload is a problem. Web reporting generates loads of data. Every single action can be measured, recorded, and scrutinized to the nth degree. Because we measure everything, we spend inordinate amounts of time recording and sorting it. But merely reporting that copious amount of data means nothing. That is simply a data dump that, in and of itself, offers little value.
Numbers don't tell a story. People do. Analysis means understanding the data, thinking about what they mean, and finding a pattern. I liken this to storytelling. After carefully considering the numbers, web analysts translate them into a narrative that even the non-techies can understand. Therein lies the difference between web reporting and web analytics. Reporting regurgitates the what, and analytics explains the why.
Any high school dropout can pull the numbers from Google Analytics or a third-party ad server and paste them into Excel. But only a highly trained high school dropout can interpret those numbers and tell you what they mean. Analysis requires a person to see the forest for the trees. Individual data points matter less than the larger narrative. For example, reporting merely lists the number of unique visitors, while analysis attributes the percentage increase to specific factors. Most importantly, after neatly summarizing the information, good analysis includes recommended actions. Those next steps are why we do analytics: not simply to measure, but to improve. Distinguish between data reporting and data interpretation; this is the first and most important rule of web analytics.
As anybody who has worked at a large company can attest, only grain farmers love silos more than corporate America. For our purposes, the problem lies with the chasm between the marketing and IT departments. Marketing manages the paid and earned media while IT runs the website (owned media). All too often the webmaster, as part of the IT department, has no contact with the marketing department. This creates a schism between what drove the results and any analysis of them.
I recall one client who issued a monthly web traffic report like clockwork. Each month, without fail, this report listed the client's top advertising partners as the top sources of traffic. Given the large advertising budget, this should have been no surprise. However, the report consistently failed to mention that those sites were the top advertising partners. The reason was rather straightforward: The person issuing the reports simply did not know the media plan. The report repeated what the advertising group already knew. Worse, it presented a skewed picture to the C-suite. Don't be that guy; use this article as your analytics crib sheet.
Distinguish between paid and non-paid referral traffic
Of course paid advertising generates traffic. This should be a no-brainer. The more important takeaway is how much traffic paid media produced relative to other efforts. To parse the data even further, determine which sites produced both paid and unpaid traffic. Indeed, search engines often dominate both categories. Paid search boosts Google traffic, but so can offline media. Print and broadcast TV often have demonstrable effects on searches. Since search volume is influenced by so many factors, carefully distinguish PPC results from organic traffic. This helps contextualize the relative merits of different media.
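The split described above can be sketched in a few lines. This is a minimal illustration, not a production pipeline: the session records, source names, and the `PAID_MEDIUMS` set are all hypothetical stand-ins for whatever your analytics export actually provides.

```python
# Sketch: bucket referral traffic into paid vs. non-paid per source,
# then find sources that appear in both buckets. All data is invented.
from collections import defaultdict

sessions = [
    {"source": "google", "medium": "cpc", "visits": 1200},
    {"source": "google", "medium": "organic", "visits": 3400},
    {"source": "bing", "medium": "organic", "visits": 450},
    {"source": "news-site.com", "medium": "display", "visits": 800},
]

# Hypothetical list of medium tags we treat as paid.
PAID_MEDIUMS = {"cpc", "ppc", "display", "paid"}

def split_paid_organic(rows):
    """Tally visits per source, bucketed into paid vs. non-paid."""
    totals = defaultdict(lambda: {"paid": 0, "organic": 0})
    for row in rows:
        bucket = "paid" if row["medium"] in PAID_MEDIUMS else "organic"
        totals[row["source"]][bucket] += row["visits"]
    return dict(totals)

breakdown = split_paid_organic(sessions)
# Sources driving both paid and unpaid traffic deserve a closer look.
overlap = [s for s, t in breakdown.items() if t["paid"] and t["organic"]]
```

In this toy data, only the search engine shows up in both buckets, which mirrors the point above: search tends to dominate both categories.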
If the left hand doesn't know what the right hand is doing, don't bother reporting on either. Web analytics requires that all of the sources of traffic (marketing, corporate communications, PR, advertising, e-commerce, SEO, etc.) share their plans. Without all of the correct inputs, don't expect accurate -- let alone informative -- reports. For example, broadcast media generates both organic and paid search traffic. Therefore media planners need to share the TV schedules so that analytics can quantify the effect on search traffic. Spikes in traffic should not just be identified but, whenever possible, sourced. To correlate the media with web spikes, simply overlay the media plan with daily web traffic. Simple charts like that add more value than 10 dense pages of numbers.
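The media-plan overlay suggested above can be as simple as tagging each day's traffic with whichever flights were on air. A minimal sketch, with made-up dates, visit counts, and flight names:

```python
# Sketch: annotate daily web traffic with the media flights live that
# day, so spikes can be sourced. Dates and numbers are illustrative.
from datetime import date

daily_traffic = {
    date(2024, 3, 1): 5200,
    date(2024, 3, 2): 5100,
    date(2024, 3, 3): 9800,  # a spike worth explaining
}

# Media plan entries: (flight name, start date, end date).
media_plan = [
    ("TV flight A", date(2024, 3, 3), date(2024, 3, 10)),
]

def flights_on(day, plan):
    """Return the names of all flights running on the given day."""
    return [name for name, start, end in plan if start <= day <= end]

overlay = {day: (visits, flights_on(day, media_plan))
           for day, visits in daily_traffic.items()}
```

Charting `overlay` puts the spike and its likely cause on the same page, which is exactly the kind of simple picture that beats 10 dense pages of numbers.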
Be aware of report creep
Once you have established which metrics to analyze, why they moved, and what to do next, all of this needs to be distilled into a report. For better or worse, everyone likes a good PowerPoint presentation. However, it is not uncommon for these reports to grow unwieldy over time. What starts as a one-page dashboard ends up as a 30-page magnum opus. The reasoning is understandable but misguided: if one graph is good, two must be better, and three charts better still. This is typically where web analysis falls prey to web reporting. More charts do not produce more insight.
Keep your dashboard simple
Dashboards are increasingly popular among web advertisers. These are one-page documents that neatly summarize performance. In my opinion, a web dashboard should be no different from a car dashboard. Both should be simple to read and tell you just the most important facts. Driving on the autobahn, you don't need to know every facet of the car's performance. You just need the essentials -- speed, RPM, and perhaps miles driven (odometer). Indeed, an overly complicated dashboard would distract drivers and cause crashes. Similarly, a web dashboard needs to present just the basic, most pertinent information. It is tempting to list every web metric ever, but that would only distract from the real issues. To paraphrase Smokey the Bear, "Only you can prevent report creep."
Report only what is important
While this sounds simple, I guarantee that in actuality it is quite hard. The more people who receive the report, the more additional metrics they will suggest. Remember, though: too many cooks make the food bland. In my experience, the more the recipient gets paid, the more he or she really just wants topline data. The CEO doesn't really care about the bounce rate, although the CMO probably does. Thus, you need to know your audience. I will acknowledge that this is easier said than done. Certainly make the raw data available for those ambitious executives who want to dig deeper. In my experience, good analysis trumps data collection. The more insight you can provide, the less people care about the lists of numbers.
Expect discrepancies and plan accordingly
Web analytics attempts to explain activity using disparate reporting systems. Rarely do the separate clickstream (ad server) and web log (site visitation) databases mesh neatly together. Clicks recorded by the ad server will differ from the number of visits recorded by the site-side measurement tool (Omniture, Coremetrics, Google Analytics, etc.). While a click leads to a visit, they are not one and the same action; clicks and visits are simply measured differently. Expect discrepancies, and plan accordingly: create an ongoing baseline of how much the numbers typically differ. Trouble arises when these numbers fall out of sync beyond the normal discrepancy margins. Source the numbers presented on each page and detail the time frame they represent.
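The baseline idea above can be sketched directly: track the typical click-to-visit gap, then flag days that stray well outside it. The history figures and the 5 percent tolerance are illustrative assumptions, not industry standards.

```python
# Sketch: baseline the ad-server-clicks vs. site-visits discrepancy
# and flag outliers. All numbers and thresholds are made up.

# Historical pairs of (ad-server clicks, site-side visits).
history = [
    (10000, 8700),
    (12000, 10300),
    (9000, 7900),
]

def discrepancy(clicks, visits):
    """Fraction of recorded clicks that never registered as a visit."""
    return (clicks - visits) / clicks

# The "normal" gap, averaged over past reporting periods.
baseline = sum(discrepancy(c, v) for c, v in history) / len(history)

def is_out_of_sync(clicks, visits, tolerance=0.05):
    """True when today's gap strays more than `tolerance` from baseline."""
    return abs(discrepancy(clicks, visits) - baseline) > tolerance

# A day with a far larger gap than usual should trigger investigation.
alert = is_out_of_sync(10000, 6000)
```

The point is not the particular threshold; it is that "clicks and visits disagree" only becomes a finding when the disagreement exceeds its own historical norm.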
Define your metrics and be consistent
It is easy to fall into the trap of comparing numbers based on different metrics or time periods. For example, bounce rates do not have a standard definition. They can mean the percentage of people who visit just one page, or they can refer to the percentage of people who visit one page for less than 30 seconds. Either way is technically correct, but do not change the metric mid-campaign. Similarly, be careful to compare data over similar time frames. Do not report Monday-through-Sunday data one week but Tuesday-through-Monday data the next. Even monthly comparisons can be tricky. Consider: February has three fewer days than January, which means total monthly traffic might decrease. For the reports to be accurate, make sure the comparisons are "apples to apples" and not "apples to oranges."
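One simple way to keep monthly comparisons apples-to-apples is to normalize each month's total by its actual day count. A minimal sketch with invented visit figures:

```python
# Sketch: convert monthly totals to per-day averages so a short
# February isn't mistaken for a traffic decline. Figures are invented.
import calendar

monthly_visits = {
    (2023, 1): 310000,  # January: 31 days
    (2023, 2): 285600,  # February: 28 days
}

def daily_average(year, month, visits):
    """Divide the monthly total by that month's actual day count."""
    days = calendar.monthrange(year, month)[1]
    return visits / days

averages = {(y, m): daily_average(y, m, v)
            for (y, m), v in monthly_visits.items()}
```

In this toy data, February's raw total falls, yet its per-day traffic actually rises -- exactly the kind of distinction an analyst, rather than a reporter, would surface.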
Don't let data be your digital albatross. Think about what the data means. Make recommendations based on that insight. Figure out how to present it all in mercifully brief presentations. If you can do that, you will be a web analytics rock star.
On Twitter? Follow iMedia Connection at @iMediaTweet.