Digital marketing is an uncertain environment. Apart from the new channels and modes of interaction with consumers, there's also the ability to record, measure, and report on everything. From mouse clicks to Facebook chats, everything that happens online is recordable, measurable, and reportable. If we didn't know it before (and most of us did), a number of intelligence agencies have shown us that recently. Since we've now entered the age of big data and data mining, there are good reasons for recording and processing as much information as you can. However, no human organization can possibly use, or even want, all of the data all of the time. To misquote a great man, "You need some of the data some of the time, but you never need all of the data all of the time."
I discussed this with Bryan Eisenberg, one of the fathers of web analytics, who has expressed similar concerns in his "Smarter Data Manifesto." He said, "Don't measure anything that you can't find a direct line of sight back to your financial statements. You manage by what you measure so focus on those things that affect the bottom line directly and over the long term (brand metrics)."
What you need or don't need depends on your objectives, which will in turn determine what actions you take. So it's axiomatic that you shouldn't measure anything you can't act on. Don't measure form performance if you're not going to change the form, and don't measure Facebook activity if you don't have a Facebook campaign. Beyond that, there are a few things people like to measure that are a complete waste of time no matter what your objectives are or what you can act on.
Social activity is the most blatant area of difficulty. At the foundation of web analytics is a void we can never fill. We can never really know what sort of person is behind the behavior web analytics records because people don't visit websites, devices do. If you have a look inside the data your web analytics system is recording, you'll see that the "people" it counted were really just IP addresses and system identifiers. You hope that system was being operated by a human, but you can't tell. You hope it was just one person, but if people swapped over in the middle of the session, you'll never know. It's also possible that there was no person behind the device. There are plenty of robots out there driving browsers now.
These robots have always been around, but they've never been a major problem until now. Before the days of social media, there wasn't much to be gained by writing a program that pretended to surf the web like a person. These days, however, poor web analytics practice has created an entire industry designed just to serve up useless data. The problem stems from counting volume without any reference to quality or segmentation. The vast amount of traffic on the web disguises how varied the mix is within that volume. Raw totals are rarely of any use. The lack of detail disguises so much that they might as well be completely wrong.
I am going to suggest five things you can stop measuring. By all means continue to gather the data, but don't bother reading the reports. These are the popular metrics in social media and search marketing that people spend a great deal of time and money chasing, even though they are a total waste of time.
Let's start the dump list with counting Twitter followers. It's a completely pointless exercise. Think about exactly what the number of Twitter followers represents. It is nothing more than the number of Twitter accounts that have connected to yours. It's not the number of people actively receiving tweets. Up to half of all Twitter accounts are inactive, while many are just spambots. It is estimated that two-thirds of the Twitter fans of many celebrities and politicians are fake. So we have to ask: Why would someone purchase fakes on a system whose sole function is to communicate with people? The reason, of course, is that many people think the number of followers means something -- that it represents people. That's fine if you're a low-level blogger and want the naïve press to think you're someone worth talking to once in a while. However, the raw number of followers is of no use for digital marketing. Not only has this misperception by the press spawned the Twitter spambot industry, but it has now also reached the level where an anti-spambot industry is developing, as we see with companies like Statuspeople and Twitter Audit.
The only real way to assess the impact of Twitter on a brand is to use conversation analysis systems, such as Transana, to determine the sentiment being expressed. A group of genuine human Twitter followers will include detractors as well as supporters, so even if you do count genuine humans, you still need to understand their feelings to understand what impact Twitter is having on market sentiment. The social influence estimator, Klout, takes a small step in this direction when it considers retweets and mentions, but even these can be robotically generated, and it still doesn't take into account what people are saying about you within those mentions.
Facebook "likes" are everywhere. "Likes" can indicate a real engagement with your brand or just momentary amusement. There have been plenty of attempts to calculate the value of a "like." Estimates by companies that sell social media marketing range from $3.60 to more than $100. Estimates by brands themselves range from $0.20 to $1. Estimates by independent web analysts tend to put the value at zero. As Forrester Research's Augie Ray put it, "The answer is zero -- unless and until the brand does something to create value."
The vast majority of every website's traffic is transient. This means the vast majority of people who click the "like" button will never return to the site of their own accord. If you want to get value out of them, you need to actively do something. Given the diverse range of people and their intent when "liking" you, there's little you can do that will appeal to them as a single homogenous mass except offer them cash. If you want to do anything more sophisticated than that, you need to segment them and respond accordingly. This means the important thing is how many fans you've got in each segment, not the grand total.
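The segmentation point above is easy to sketch. This toy example (the fan records and segment names are invented for illustration) shows the difference between a count per segment, which you can act on, and the grand total, which you can't:

```python
from collections import Counter

# Hypothetical fan records; in practice these would come from your
# CRM or social platform data, tagged with whatever segments you use.
fans = [
    {"id": 1, "segment": "repeat_customer"},
    {"id": 2, "segment": "one_time_visitor"},
    {"id": 3, "segment": "one_time_visitor"},
    {"id": 4, "segment": "competitor_watcher"},
]

# The useful view: a count per segment you can respond to.
fans_by_segment = Counter(f["segment"] for f in fans)

# The headline number everyone reports, and the least actionable one.
grand_total = len(fans)
```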
I'm sure it's no surprise by now that I'm not a fan of social mention reports. A social mention is nothing more than the use of a keyword in some form of social media. The problem is, again, one of context. Without knowing how the word was used, in what context, and to express what sentiment, a mere count is meaningless. It is possible, with some serious effort, to tune some social mention systems to consider surrounding phrases and other contextual clues, but it requires a great deal of effort, and few social mention systems are up to the task.
Search engine optimization (SEO) is another area where there is a great deal of confusion. SEO consultants tend to fall into two opposing camps depending on how they think Google ranks websites. One camp favors content, while the other favors links. The link group holds that what's most important is the number of sites that link to you, so many chase a link count. As we saw with Twitter, this created a spam industry. In this case, the culprits were link farms: websites that existed solely to provide thousands or even millions of links to their customers' websites. All of this fuss derives from the premise that every link was created by a human who had reviewed and approved the content at the other end of the link -- an endorsement. The people behind the search engines aren't stupid. Some of them are even ex-rocket scientists, so once the link-building industry arose, Google commenced a cycle of changes designed to filter out what it calls "low value" links. For example, Google now counts the total number of links coming out of a site and spreads the site's total value amongst them. This means a link from a site with only five outbound links is 200 times more valuable than a link from one with 1,000.
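The value-spreading arithmetic in that last sentence checks out. Here is a minimal sketch under the assumption of a flat per-site value of 1.0; the function name and the flat value are illustrative, not Google's actual formula:

```python
def link_value(outbound_links, site_value=1.0):
    """Value passed by one link when a site's total value is
    divided evenly among its outbound links."""
    return site_value / outbound_links

# A link from a site with 5 outbound links versus one with 1,000:
sparse_site_link = link_value(5)     # 1.0 / 5
link_farm_link = link_value(1000)    # 1.0 / 1000
ratio = sparse_site_link / link_farm_link  # the 200x difference in the text
```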
Unless you have very few links and can check them manually, or you are scrupulous about your link building activities, your link count will mean nothing because it covers different types of linking sites, each with a different value. A single number tells you nothing about the impact on your listings. A large number of poor links can even count against you. In such cases, it can be better to have a few dozen hand-built links than thousands in link farms. If you want to count links, you need to segment them into different channels and different types of sites, so you can determine the value of the link each site provides.
Search engine visibility
One problem with assessing SEO performance is that it is incredibly fragmented. Activity within a search engine is segmented by search phrase or keyword. Most sites will have thousands or even tens of thousands of different keywords delivering traffic to them. The site's ranking can vary for each and every one. Search engine optimization is, therefore, about improving the individual rankings of each of these phrases. This makes assessing the overall state of a site's listings very difficult. It's hard to make sense of a report containing a wall of phrases and their ever-changing rankings -- and even harder to gauge progress. The industry's response has been to develop the "Search Engine Visibility Score," a single number based on a website's rankings for many phrases. There's no rigid standard to determine how this is calculated, but the basic idea is that a No. 1 listing is worth a certain percentage, a No. 2 listing is worth something less, and so on down to No. 10 (no one cares what happens on the later pages). The calculations are set so that if you had all of your keywords at No. 1, you'd have a score of 100 percent.
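The scoring idea above can be made concrete with a short sketch. The weight table here is an assumption for illustration; every tool uses its own weighting, but all share the property that a keyword set ranked entirely at No. 1 scores 100 percent:

```python
# Assumed weights: position 1 earns 10 points, down to 1 point for
# position 10. Positions beyond 10 earn nothing ("no one cares what
# happens on the later pages").
WEIGHTS = {pos: 11 - pos for pos in range(1, 11)}

def visibility_score(rankings):
    """rankings: one search position per tracked keyword
    (use None for keywords that don't rank in the top 10)."""
    earned = sum(WEIGHTS.get(pos, 0) for pos in rankings)
    maximum = WEIGHTS[1] * len(rankings)
    return 100.0 * earned / maximum
```

Note that very different ranking profiles can collapse into the same blended score, which is exactly why the single number hides which keywords actually moved.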
Unless your keywords are all of equal value (and they're not), this doesn't tell you much. Many performance indicators, including bounce rate, form abandonment, average order value, engagement, and conversion rate, vary from search phrase to search phrase. You can only focus on a limited range of keywords at a time. It's kind of comforting to measure changes in the search engine visibility score for these, but it's not much help for actually taking action.
There are two threads running through the problems with these metrics. The first is the assumption that online activity represents human activity. That was true when the web started, but it hasn't been true for years. It didn't really matter until social networking arose and it became worthwhile to create software that could pretend to be human. In the current state of affairs, numbers that don't filter out the robots are misleading. The other thread is that of totals. Total numbers, for many web metrics, aggregate so many different segments that they become meaningless. Such numbers cannot serve as a platform for planning action. If you're not going to use these numbers to do something, why bother to give them any time in the first place? Remember: You don't need all of the data all of the time; you just need the stuff you can use now.