
5 signs your data has no meaning

Brandt Dainow

Not all data is equally valuable. In fact, some of it is downright useless or even harmful. There's a great deal of web analytics data around these days. Most commercial websites now have web analytics reporting, with 80 percent of them using Google Analytics. Online marketing campaigns regularly assess performance in terms of quantitative metrics such as impressions, CTR (click-through rate), and CPA (cost per acquisition). People have become familiar with technical terms from web analytics like bounce rate, exit rate, abandonment, conversion, and engagement index.




However, with all this data floating around, there's bound to be some rubbish. So there's a good chance you'll be exposed to web analytics data that is of no value to you. It may even be wasting your time or misleading you. Here's how to spot useless data.

Incomprehensible data


You can't use data you can't understand. You probably shouldn't expect to understand every metric that comes your way, but someone should be able to explain your metrics to you if you ask. This is not always the case. It is possible that metrics are being provided that no one understands. This is most commonly the case with reports from ad delivery systems, such as bid management systems, emailers, and other systems that automate placement of marketing material in some way. These are often extremely sophisticated technology systems. However, they are usually operated by agencies staffed with people who lack detailed technical knowledge of how these systems work. There's nothing wrong with this in and of itself, but it does mean most agencies can click a few buttons and produce reports without knowing what's being measured, how, or even what the metrics reported mean. It's just -- "Click, print, email. Hey, presto! We're reporting." This boring and uncreative task is often assigned to interns and other office hopefuls, so the agency never really develops any understanding of performance metrics.


I once dealt with an SEO agency that reported its progress in terms of "the percentage of the relevant search space." The agency would proudly announce to the client that "51.5 percent of the relevant search space has been captured." Each month the SEO agency would report capturing a little more of this search space for the client. The client was happy because the percentages kept going up, but the client didn't really understand what it meant in concrete terms it could design marketing activity around. When questioned, we discovered the SEO agency could not explain what a "search space" was, how it was measured, how you assess relevant versus irrelevant search space, or even how any of this related to something as basic as website traffic.


The data is used to assess the performance of the agency providing it


Many agencies provide the very metrics their clients use to assess their performance. Numbers calculated by the agency determine how much the client pays and whether the agency gets more business. If those numbers are poor, it costs the agency business. It's a rare client who's going to say, "Well, that was a really terrible campaign. Your execution was poor, and you failed to meet all the objectives. But you did report all of this honestly, so I'll give you more work."


In the offline world, it is usual to check the numbers an agency gives you if it's going to affect that agency's bottom line. It's considered naïve to assume everyone will tell you the complete truth if doing so is going to hurt them. You don't have to be completely cynical to recognize that sometimes people will spin a little, err in their own favor, or highlight the numbers that present them in the best possible light. Even something as basic as a FedEx delivery sees the number of boxes delivered checked against the shipping note. Yet this seems to be rare thinking in the online marketing world. It's as if everyone trusts everyone completely. Who would have guessed -- online marketing is the most honest and trustworthy business environment known to humanity! Nobody in digital marketing will ever spin, exaggerate, or lie.


If you're having trouble accepting that the digital marketing community is a community of saints, check the numbers people are sending you. It's always possible to produce your own independent metrics to compare with someone else's. If an agency claims to be sending 100,000 people to the site, you can check your own site metrics. If it is placing AdWords via a bid management system, you can check AdWords data directly yourself. It's surprising to see how many agencies withdraw their bids for work when told their performance will be independently assessed.
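To make the idea concrete, here's a rough sketch in Python of the kind of sanity check I'm describing. The figures, month labels, and tolerance are all invented for illustration; the point is simply to put claimed and independently measured numbers side by side and flag the gaps.

# Compare agency-claimed visits against your own analytics figures and
# flag periods where the claim is far above what you can verify.
# All numbers and the tolerance threshold below are hypothetical.

agency_claimed = {"Jan": 100_000, "Feb": 110_000, "Mar": 95_000}
own_analytics  = {"Jan":  31_000, "Feb":  36_500, "Mar": 30_200}

TOLERANCE = 1.15  # allow roughly 15 percent for normal cross-system differences

for month, claimed in agency_claimed.items():
    measured = own_analytics.get(month)
    if measured is None:
        print(f"{month}: no independent figure to check against")
        continue
    ratio = claimed / measured
    verdict = "investigate" if ratio > TOLERANCE else "ok"
    print(f"{month}: claimed {claimed:,}, measured {measured:,}, {ratio:.1f}x -> {verdict}")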


While I am sure there are honest agencies out there, every time I've checked agency numbers I have found they are higher than mine -- much higher. Typically the numbers claimed have been three to five times what I could verify independently. I don't know if such exaggeration is typical; it's possible I've been unlucky. Maybe you're lucky and all your people supply 100 percent accurate numbers. However, unless you check their numbers against your own, you can't possibly know for sure.

Vanity metrics you can't do anything about


Data is not the same thing as information. Information is data that has value. You don't need data; you need information. Data is only of value if you can use it. There are many popular web metrics that are of no value. People often refer to these as "vanity metrics."


For example, page views are fairly useless. This is just a count of how many web pages were viewed on a website. The count of page views says nothing about your site's performance, and there's nothing you can do directly to increase it. The number is just a product of other factors that you can do something about, such as the number of pages on the site and the number of visits. You can increase the number of visits, and you can add or remove pages, but you can't act on page views themselves. Page views are useful only as a step toward calculating the average number of pages per visit.
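Here's a small illustration of the difference (the figures are made up): page views can rise while the metric you can actually act on, pages per visit, falls.

# Raw page views versus the derived metric they feed into.
# Monthly figures below are invented for illustration.

monthly = {
    "Jan": {"page_views": 240_000, "visits": 60_000},
    "Feb": {"page_views": 250_000, "visits": 70_000},
}

for month, m in monthly.items():
    print(f"{month}: {m['page_views'] / m['visits']:.1f} pages per visit")

# Page views went up (240,000 to 250,000), but pages per visit fell
# from 4.0 to roughly 3.6 -- the number you can act on moved the other way.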


Web analytics systems have to be comprehensive. They need the capacity to serve up whatever information anyone may happen to want, so they cover a large number of measurements. People don't need most of these numbers most of the time; they only need the few that relate directly to current activities. However, there is a tendency to produce comprehensive reports that flood managers with a tsunami of data. Often this is because the people producing the reports don't know what the people reading them are concerned with or what they need. When I design a web reporting system for a client, I find myself dumping reports more often than adding new ones.


After reading a good web analytics report you should emerge with a number of things to do. If your web analytics reports don't lead to a to-do list, you're not deriving any value from them; they're just vanity metrics.


Compared to what? Raw numbers mean little.


"Hey, Barbie. How's your website performing?"


"Well, Ken, I got 100,000 visitors last month."


"Wow, Barbie, that's the same as my site. Isn't it great!"


"Sorry, Ken, but I usually get 1 million visitors each month. A mere 100,000 means my site is in serious trouble."


"Gee, Barbie, that may be bad for your site, but I usually only get a few thousand visitors each month. A massive 100,000 means stellar success for me."


Web analytics numbers are always relative to something else. They can be up or down on the previous month or previous year. They can be higher or lower than your competitors' numbers. On their own, isolated from trends or their environment, they don't mean anything.


The most common place raw numbers are used in isolation is pay-per-click bidding. The majority of bids are placed without reference to the income they will produce. Many AdWords bidders are paying two or three times what the resulting click-through traffic actually earns.
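As a rough worked example (all inputs invented), the most a click can be worth is the conversion rate multiplied by the income per conversion. Bidding above that break-even point means paying more for the traffic than it earns.

# Break-even bid check. Conversion rate, revenue, and bid are hypothetical.

conversion_rate  = 0.02   # 2 percent of clicks convert
revenue_per_sale = 40.00  # average income from one conversion
current_bid      = 1.60   # what is actually being paid per click

break_even_cpc = conversion_rate * revenue_per_sale  # 0.80 in this example

print(f"Break-even cost per click: {break_even_cpc:.2f}")
print(f"Current bid:               {current_bid:.2f}")
if current_bid > break_even_cpc:
    print(f"Paying {current_bid / break_even_cpc:.1f}x what the traffic earns")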


A web metric is meaningless in isolation. It needs to be used in a trend or a comparison. It has to tell you something dynamic -- how you're performing over time, or how your performance compares to that of others.

Data doesn't match


Compare the same number from two different web analytics systems, and you'll almost certainly get two different figures. This is a big problem, and it most commonly occurs when you start independently checking the numbers your agencies or outlets are claiming. Whenever two different web metrics systems compare numbers, you inevitably get disagreements. The problem stems from the fact that there are no standards for how web metrics should be measured or calculated. Everyone does it differently. Different systems can even use the same name for completely different things.


A common example is the way ad systems assess multi-touch conversions. A multi-touch conversion occurs when someone makes multiple visits ("touches") to the site before converting. By default, Google Analytics attributes the conversion to the source of the last visit. Ad people like to attribute conversions to the first visit. Their logic is that if the ad introduced the visitor to the site in the first place, the ad should be credited for the eventual conversion. If this happens to present their work in the best possible light, I'm sure that's just a happy coincidence. It's debatable whether you can really credit an ad with a conversion when six months and ten visits lie in between. To be really accurate, we should attribute something to every visit in the multi-touch sales sequence. The result, in any case, is that AdWords conversion reports from Google Analytics won't match the numbers from the advertiser's ad management system.
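To see why the reports diverge, consider a single converting visitor (the journey below is invented). Crediting the first touch and crediting the last touch produce two different, equally self-consistent reports.

# One conversion, two attribution rules. The visit history is made up.

touches = ["display ad", "organic search", "email", "direct"]  # visits before converting

first_touch = touches[0]    # the rule ad systems tend to prefer
last_touch  = touches[-1]   # the last-visit default described above

print(f"Ad system credits the conversion to:   {first_touch}")
print(f"Last-touch report credits it to:       {last_touch}")
# Same sale, two different lines in two different reports.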


Google Analytics doesn't make this any better by sampling. Once your traffic gets above a certain level, Google Analytics stops counting every visit. Instead, it starts taking a sample, keeping its own workload within limits. The busier your site, the smaller the portion of traffic Google will sample. It estimates the missing data from the samples. Google sampling can range from 50 percent of your traffic to less than 10 percent. Naturally, the more sampling, the less accurate the estimates. It's very difficult to compare sampled Google Analytics data with complete data from another system because you have no way of assessing the accuracy of Google's estimates. However, if you've got so much traffic Google Analytics drops into sampling mode, you've probably outgrown the free web metrics model. With that much traffic, you should be able to justify paying for something better.
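Here's a simplified illustration of the reconciliation problem sampling creates (the sampling rate and counts are invented, not Google's actual behaviour): a scaled-up estimate will rarely match a system that logs every visit, and you can't tell how far off the estimate is.

# Sampled estimate versus a complete count. All figures are hypothetical.

sampling_rate   = 0.10      # suppose only 10 percent of visits were examined
sampled_visits  = 9_874     # visits seen in the sample
estimated_total = sampled_visits / sampling_rate

complete_count = 101_350    # a system that counted every visit

gap = abs(estimated_total - complete_count) / complete_count
print(f"Estimated from sample: {estimated_total:,.0f}")
print(f"Complete count:        {complete_count:,}")
print(f"Difference:            {gap:.1%}")
# One figure is an estimate, the other a census; the gap is not evidence
# that either system is broken.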


This is never a hopeless situation. Once you understand how two systems calculate the same metric, it's always possible to bring them into line. For example, Google Analytics can easily be configured to use first-touch attribution, bringing it into line with ad delivery systems. Sometimes you have to do the manipulation outside the two reporting systems, in something like Excel or Numbers, but with a little understanding of how each system works, alignment is always achievable.
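As a sketch of that kind of outside-the-systems manipulation (Excel or Numbers would do the same job; the journeys below are invented), you can re-credit each converting journey to its first touch so the totals become comparable with an ad system's report.

from collections import Counter

# Re-attribute conversions to the first touch so two reports can be
# compared like for like. Journeys are invented for illustration.

journeys = [
    ["AdWords", "organic", "direct"],
    ["AdWords", "email"],
    ["organic", "AdWords", "direct"],
]

last_touch_totals  = Counter(j[-1] for j in journeys)   # analytics-style view
first_touch_totals = Counter(j[0] for j in journeys)    # ad-system-style view

print("Last-touch totals: ", dict(last_touch_totals))
print("First-touch totals:", dict(first_touch_totals))
# Once both reports use the same attribution rule, any remaining gap is
# much easier to track down.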

Summary


Let's summarize this by turning it around. What are the signs your data does have value (that it is information and not just noise)?


Your organization understands it
You don't need to understand every web metric, but there should be someone in the organization who can comprehend each one. Understanding a metric means knowing what it measures and how that measurement is technically calculated.


The data is independently verifiable
Where the people whose performance is being assessed also produce the data, that data should be independently verified.


Actionable metrics
Web metrics should be directly relatable to things you can do. Data you can't act on amounts to mere vanity metrics.


Comparative numbers
Web metrics should compare performance over time or against comparable figures (such as benchmarks and competitor performance).


The same numbers from different systems should match
This is unlikely to be the case when you first look, but it is always possible to achieve with a little effort.


Remember, you don't want data; you want information. That means data you can understand, trust, and do something about.


Brandt Dainow is the CEO of ThinkMetrics.

