One of the original promises of Internet advertising was the ability to target ads with laser-like precision - to be able to display advertising to only the most relevant audiences with zero waste.
To an extent, basic forms of online ad targeting delivered on that promise, especially keyword-triggered ads and basic forms of registration-based targeting. When a single targeting filter is used, Internet advertising comes closest to delivering on its promise.
However, it's rare that advertisers want to use only one targeting filter. Age, sex, household income, education, the presence of children in the household - these are all targeting criteria that are well-known to traditional advertisers. And they tend to use them in combination with one another. Most television advertising buys, for example, are guaranteed on at least two qualifiers - usually age and sex.
Targeting is Data-Driven
Internet marketing is still a fairly new endeavor for many clients, so many of them still approach targeting the way it has been done in traditional media for decades.
"Since Internet marketing is still new to a lot of clients, they tend to focus on the basic targeting attributes that are available from other media sources," confirms Suzy Lahey, product marketing manager, Targeting & Direct Marketing for Yahoo. "These are targeting concepts that are easily understood and create consistency of campaign audiences across media channels."
However, the Internet lends itself to such data-rich advertising activity that using traditional targeting filters alone only scratches the surface of the targeting options available.
In the online world, many advertising venues have trouble engaging more than one targeting filter at a time. This is a function of the percentage of a site's users for whom the relevant targeting information is actually available. One of our media buyers here at Underscore learned about this early in his career - the hard way. A site had contracted with him to deliver a geo-targeted keyword buy. The buyer had figured that geographic targeting information was available for all of the site's visitors. In reality, the size of the audience for which geography could be reasonably ascertained was much smaller. As a result, the potential reach for the buy decreased drastically - to the point at which only a few dozen impressions could be served per month.
We've come a long way since then, though. Site publishers are learning to make better use of internal data assets and external databases, both of which contribute significantly to a site's ability to offer targeting to advertisers. One technology provider that is helping publishers to leverage those data assets is TACODA Systems.
"TACODA clients are able to use its proprietary technology to extract actionable user data, often combining it with offline data, to obtain precise profiles of who is on their sites," says TACODA CEO Dave Morgan. "Groups with similar interests or demographic characteristics can be segmented to provide high efficiency for products or services that require a 'class' rather than 'mass' pitch. A couple of TACODA-enabled sites like to do extra number crunching for marketing with a separate tool, but they'll take that data and feed it back to TACODA for targeting in its ad server, email server, or content server."
The more publishers can understand about the people who visit their sites, the better targeting options they can offer to advertisers. Online advertising programs are becoming increasingly data-driven.
The Basics of Targeting and Data
There are three basic types of data that publishers can use to beef up their targeting abilities:
- Observed data - Data points that can be gathered by direct observation of a user. This can include content sections visited, exposure to advertising, interests, and many of the valuable bits of information that are gleaned from observing HTTP requests, such as IP address, browser version and operating system.
- Declared data - Data that comes from direct input from a user. This can include things like the ZIP code submitted by a user during registration, answers to surveys and questionnaires, and opt-ins for certain types of information.
- External data - Data that comes from external sources. For instance, some newspaper sites have been taking offline subscriber data and marrying it up with user profiles they've collected online, giving them numerous data points by which to target in the future.
Collected responsibly, these data types can all contribute to a robust targeting offering.
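To make the distinction concrete, the three data types can be pictured as layers of a single merged user profile. A minimal Python sketch follows; the field names, sample values, and override order are all hypothetical illustrations, not any vendor's actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class UserProfile:
    """One visitor's targeting profile, built from three data layers."""
    user_id: str
    observed: dict = field(default_factory=dict)   # gathered by watching behavior
    declared: dict = field(default_factory=dict)   # supplied directly by the user
    external: dict = field(default_factory=dict)   # matched from outside databases

    def targeting_attributes(self) -> dict:
        # Later layers override earlier ones here: a ZIP code the user
        # declared at registration beats one inferred from an IP address.
        merged = {}
        merged.update(self.observed)
        merged.update(self.external)
        merged.update(self.declared)
        return merged

profile = UserProfile(
    user_id="u123",
    observed={"zip": "10001", "sections": ["sports", "autos"]},  # from clicks and IP
    declared={"zip": "10013", "opt_in_email": True},             # from registration
    external={"print_subscriber": True},                         # from an offline database
)
attrs = profile.targeting_attributes()
```

Which layer wins a conflict is itself a design choice; this sketch trusts declared data over inferred data, but a publisher might reasonably rank the layers differently.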
Balancing Privacy and Targeted Communications
As the Internet has evolved as a communications medium, there has been significant concern on the part of consumers about how advertising is targeted online. Consumers are not necessarily comfortable with Websites targeting advertising based on details the site compiles about them. Although profiling is standard practice for the marketing industry, especially the direct-response business, profiling is less transparent to the consumer in the online channel. Thus, the battle over privacy often flares up in the form of very public skirmishes in the online arena.
"Companies can either have PII (Personally Identifiable Information) or have consumer insights -- but if combined that is when some consumers and privacy groups have an issue," says Scott Eagle, chief marketing officer of behavioral marketing company Gator. "Hence some portals and search engines have come under fire for having both."
Eagle says that Gator does not collect PII, and that this has kept consumer issues to an absolute minimum. "Since we do all our profiles on an anonymous basis -- we have no PII -- there have never been any consumer issues in the past five years since the start of our model," he says.
Stellar Targeting Requires Stellar Research
Knowing more about an audience is the key to being able to target it effectively. Thus, research and the technology behind targeting are intertwined as the industry moves closer to one-to-one targeting.
"Both research and technology are required for successful targeting," says Morgan. "Technology is only as smart as what you put into it, and what passes for 'research' is sometimes a restatement of numbers that have been derived mechanically without serious analysis. The best tools do the difficult extractions for you and automate the mundane tasks, but they also require you to know your advertiser's goals and how to use the technology to reach them. Sometimes adding panel-based data from comScore Media Metrix, or mapping your zip records to PRIZM clusters, or even adding your offline subscription list, can give you richer audience profiles. You need to know why and when and how you should do this, and you need tools that are flexible enough to handle it and do the required targeting once you've selected and analyzed the data."
Lahey of Yahoo agrees, stating that research is the more important of the two. She sees better research as the key to improving the industry's targeting capabilities. "Emphasis should be put on understanding the breadth of online data and what that means for marketers -- the options available and the performance associated with them," says Lahey. "In addition, the industry should focus on finding palatable ways to integrate offline and online consumer activity to get a more comprehensive view of the consumer and better understand cause-and-effect relationships and the impact of all the media working together."
She points out that Yahoo! Consumer Direct, supported by ACNielsen, is an opt-in panel of users culled from ACNielsen's HomeScan panel, allowing Yahoo! and ACNielsen to analyze panel members' online activities and offline purchase patterns in order to better model and target users and advertising.
Truly, the ways in which we will be able to target advertising in the future will be limited only by the imagination. Advertisers will need to be well-versed not only in their target audiences, but also in how segments of those audiences behave with respect to the product or service being offered. Media planners will need to understand their targets not only from a demographic standpoint, but also from a behavioral, psychographic, and interest standpoint. Different types of potential customers will react to messaging in different ways. The technology to target and segment these different types of consumers already exists, so media planners and advertisers will need to become experts in audience segmentation.
"In order to have scale - tens or hundreds of millions of user profiles and insights-- there needs to be technology deployed to help marketers/advertisers understand consumer needs with little/no consumer involvement," says Eagle. "Gator's behavioral marketing engine is one of these solutions. Another might be Google's use of page crawlers."
Targeting That Performs
One question you might be asking yourself is, "Do the results justify the cost of the technology to implement complex targeting?" After all, if the investment of time and technology to improve targeting produces only a tiny lift in campaign effectiveness, it might not be worth it. Results, however, have shown that such investments are wise.
"One TACODA customer, Belo Interactive, has more than a million registered users for its DallasNews.com site," says Morgan. "Combining registration data with observed behavior and interests in AMS (TACODA's Audience Management System) has helped the site to improve results by an astounding 2200% in a recent campaign that has been reported extensively. But note that it is the combination of declared and observed data plus the ability to target specifically against the resulting profile using the Audience Management System that drives the performance."
Similar results have been reported for targeted ads offered by ad sellers such as Gator, WhenU and other behavioral targeting plays.
Lahey indicates that Yahoo! has had good success, with costs being far outweighed by the benefits. "For high-ticket items that are considered purchases, we've seen a high level of success with purchase intender targeting where we can identify users that are 'in-market' based on the amount of time they spend researching online in a concentrated window," she says.
Currently Yahoo! has these kinds of targeting programs built around purchase intenders for Autos (users about to purchase a car), Travel (users about to buy an airline ticket, book a hotel, or rent a car), and Shopping-Electronics.
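The intender logic Lahey describes -- flagging users who spend a concentrated stretch of time researching a category -- can be sketched as a simple threshold rule. The event format, window, and thresholds below are illustrative guesses, not Yahoo!'s actual model:

```python
from datetime import datetime, timedelta

def is_in_market(events, category, window_days=14, min_minutes=60, min_sessions=3):
    """Flag a user as an 'in-market' intender for a category if enough
    research time, across enough sessions, falls inside a recent
    concentrated window. All thresholds are illustrative."""
    cutoff = datetime.now() - timedelta(days=window_days)
    recent = [e for e in events if e["category"] == category and e["when"] >= cutoff]
    total_minutes = sum(e["minutes"] for e in recent)
    return len(recent) >= min_sessions and total_minutes >= min_minutes

# Three recent research sessions in the autos category...
events = [
    {"category": "autos", "when": datetime.now() - timedelta(days=d), "minutes": 25}
    for d in (1, 3, 5)
]
autos_intender = is_in_market(events, "autos")    # True: 3 sessions, 75 minutes
travel_intender = is_in_market(events, "travel")  # False: no travel activity
```

A production model would weight recency and page types rather than simply counting minutes, but the core idea -- concentrated research in a window signals purchase intent -- is the same.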
Moving Toward One-To-One? Or Moving Toward Something Better?
One-to-one marketing has been a promise of Web advertising since its early days. The notion of using technology to facilitate the building of meaningful marketing relationships is attractive, but some would argue that targeting technologies are moving us toward something closer to mass personalization.
"One-to-one is not an economical model for any medium," says Morgan. "What works is efficiency - that is reaching a narrowly defined audience with precision with an appropriate message. The Internet is the only medium that can provide that kind of precision to large audience segments."
Others might argue that the core promise of one-to-one versus that of mass personalization is more of a semantic issue.
"We are all proving that one-to-one marketing to hundreds of millions of users is doable -- and in fact, may be the highest performing ROI for advertisers," says Eagle. "The staggering growth rates of this category and companies support this POV."
An Adjustment in the Way Planners Think?
Any media planner would jump at the chance to increase the effectiveness of an online media buy twofold, much less 2200%. Planners can do this by adjusting the way they think about audiences. It requires adding to the traditional arsenal of demographic targeting options: behavior, psychographics, lifestyles, and interests. The adjustment will likely take time, but remember that the first to adapt reap the rewards of the first-mover advantage.
Rip-off #2: Talk rubbish
If you want to sell garbage SEO services, it is important to understand that most of your customers have no idea how SEO works or what you do. This means you can bamboozle them with made-up jargon.
My particular favorite, used by one of the largest SEO agencies in the world, is to talk about "capturing percentages of the search space." I recall reading with joy the line in one of their reports to a client:
"The site has successfully captured 10 percent of the search space for this retail sector."
This is pure genius! It sounds great, makes the client feel important and powerful, makes you look good, and yet it communicates absolutely nothing. Let us examine this work of genius in more detail and see what we can learn from it.
We see that the client has captured 10 percent of something called a "search space." The client presumes that this means search engines. But which ones? All of them? Unlikely. Some are very small, and many of them do not work in English. Does this include national variations of major names, such as Google.co.za (South Africa) as well as Google.com? So talk of a "search space" is meaningless unless it is defined. What does capturing 10 percent mean? Is it 10 percent of the results? How could anyone calculate that? How many pages of results are we surveying? Are all positions of equal value? Is No. 1 worth the same as No. 10? Or No. 100? Does this mean the client site is listed 10 times for every 100 results?
You see the genius? Without a definition of what 10 percent means, how that number was calculated and which search engines are included in this "search space," the client learns nothing, yet manages to feel good.
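To see just how slippery the number is, consider one of the many possible definitions the report could be hiding: the share of top-10 result slots the client occupies across some keyword list. A hypothetical Python sketch -- change the depth, the keyword list, or the engines surveyed and the "percentage" changes with it:

```python
def search_space_share(results_by_keyword, client_domain, depth=10):
    """One arbitrary definition of 'search space' share: the fraction
    of top-`depth` result slots, across all tracked keywords, occupied
    by the client's domain. A different depth, keyword list, or engine
    mix yields a completely different number for the same site."""
    total_slots = 0
    client_slots = 0
    for keyword, urls in results_by_keyword.items():
        top = urls[:depth]
        total_slots += len(top)
        client_slots += sum(1 for url in top if client_domain in url)
    return client_slots / total_slots if total_slots else 0.0

# Hypothetical tracked results for two keywords
results = {
    "garden chairs": ["shop.example.com/chairs", "rival.com/a", "rival.com/b"],
    "patio tables": ["rival.com/c", "shop.example.com/tables"],
}
share = search_space_share(results, "shop.example.com")
# 2 client slots out of 5 tracked slots -> 0.4
```

Every parameter here is a free choice, which is precisely why an undefined "10 percent of the search space" tells the client nothing.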
Should the client actually ask for definitions of the search space, you can just throw him a list of search engines. However, if the client asks how you calculate 10 percent, things get difficult. The best defense is to hide behind intellectual property rights by telling the client the formula is part of your unique expertise (see Rip-off #3).
Another lovely piece of made-up terminology the above-mentioned company uses is to distinguish between crawl maps and site maps. Everybody knows how to create a site map, but only this company knows how to create a crawl map. In fact, since there is no such thing, only this company even knows what one is (and it's not telling). Once you have convinced the client that a crawl map is important, he will discover that none of his technical staff know how to create one, or even what one is. This proves the superior expertise of the SEO company and forces the client to use its services. Of course, you must never supply a crawl map or explain how to make one. Instead, tell the customer that it is proprietary technology you cannot reveal (see Rip-off #3).
Rip-off #3: Hide behind intellectual property rights
We all know that the techniques required for SEO can be found in many books and on many websites. Luckily, the customer does not. If they ask for proof that you have done something that you have not, try to hide behind claims of intellectual property. I recently saw this done with regard to link building. A company had been paying for link-building services but could see no benefit from them. Its SEO company claimed to be gathering 10 to 20 links per month. This is a carefully chosen number -- not so many as to seem extravagant, nor so few as to seem lazy. When asked to provide a list of the links it had obtained, the SEO company refused. It explained that the sites in which it had placed links had been carefully identified through research using its SEO expertise, and that providing the list would threaten its carefully created intellectual property. Believe it or not, the client accepted this lovely piece of legal waffle -- and continued to pay for the services.
Rip-off #4: Offer meaningless guarantees
Nothing works to allay a prospect's fears or doubts better than a guarantee of success. The skill lies in offering something that will cost you nothing. My particular favorite is the "we'll do it again if we achieve nothing" guarantee. All you have to do is tell customers you will give them 12 months of free service if you do not achieve their goals by the end of the first year. Then, you do nothing for them all year except send invoices and collect their money. At the end of the year, when they say they are unhappy, you say, "No problem, we'll give you another year for free." Then you do nothing for the client for the second year. Sounds too easy to be true? I didn't make this idea up myself; I saw it used by a very successful medium-sized SEO company.
Another guarantee that works well is to guarantee No. 1 placement in 75 percent or so of the world's major search engines. The trick to this one -- and this is a very popular trick -- is to provide a list of target engines that includes very minor search engines. These are ones that no SEO company bothers with because they have insignificant market share. Of course, you include Yahoo, MSN and Google in your list. However, you don't waste any effort on them; they are too hard because there is too much competition for placement on them. You have to know your business to succeed with these three. However, if you include them in a list of 10 or 20 other search engines, all you have to do is get decent places in the others to meet your stated goal. Of course, the client will see no benefit, but he can hardly sue you; who wants to get into a legal debate about the definition of a "major" search engine?
Rip-off #5: Focus on page rank
Google's page rank is a measure of a site's importance on a global scale. Google uses it to determine how frequently to visit a site and what percentage of the site to index on each visit. It is not the sole determinant of such things, but it has a major influence.
Fortunately for us, many people believe that page rank influences a site's positions in the listings. This is not true. It does not matter how important a site is. If it genuinely does not match a search phrase as well as a less important site, the less important site will be listed higher.
People focus on page rank because it is visible. Most clients (and many SEO engineers) believe page rank is an indicator of your positions in listings. Don't disillusion them. Since page rank changes slowly, you can consume a large amount of their cash before you are expected to achieve anything.
Rip-off #6: Focus on link building
A site's page rank is largely determined by the number and quality of the links that lead to that site. This has led many people to believe that the process of search engine optimization is to first ensure the site is well optimized, then to focus on getting links to it.
The critical thing that most people miss is that the quality of the links is more important than the number. Google assigns a nominal value of one to all the links coming out of a site. Thus, if a site has 10 outgoing links, each of those links has a value of 0.1. If a site has 100 links, then each of them has a value of 0.01. Furthermore, if the site has too many links, Google downgrades it because it is obvious that the links are there simply to try to influence Google. Such a site is known as a "link farm." Google hates link farms.
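The arithmetic above -- one nominal unit of value split evenly across a page's outgoing links -- is easy to sketch. Real PageRank is considerably more involved, but the dilution principle is the point:

```python
def link_value(outgoing_links: int) -> float:
    """Under the article's simplification, a page passes a nominal
    value of 1 split evenly across its outgoing links, so every
    extra link dilutes what each individual link is worth."""
    if outgoing_links <= 0:
        raise ValueError("need at least one outgoing link")
    return 1.0 / outgoing_links

# A link from a 10-link page carries ten times the value of a link
# from a 100-link page -- quality of source beats raw quantity.
ten_link_value = link_value(10)       # 0.1
hundred_link_value = link_value(100)  # 0.01
```

This is why a single link from a selective, relevant page can outweigh thousands of links from pages stuffed with them.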
Since clients do not realize the importance of link quality, you can focus on getting as many links as possible, ignoring quality, and still keep the customer happy. You can work with link farms (there are many in India), thereby getting the customer hundreds or thousands of links with relative ease. This generates the illusion of a great deal of work -- for which you will be paid. Of course, the customer will gain no benefit and may even be downgraded by Google for being linked to by link farms. However, in this case, you can truthfully say you have done what you were paid to do and blame Google.
If you want to make it even easier, you can create your own sites and stuff them with links to your clients. Some SEO companies even create software to do this automatically, generating huge quantities of links with no labor costs. Of course, Google is not stupid; dozens or hundreds of sites sitting on the same server, cross-linked to each other and to clients, often with more links than content, just smells bad. It is pretty obvious to Google that this has been done by a single organization solely to trick Google. Google hates being tricked, so it will ban these sites as fast as possible. That's not a problem for you -- simply buy some more cheap domain names and move the sites to a different server. If you use this strategy, as some major SEO companies do, be prepared to migrate your whole site complex to a new server every year or two.
Rip-off #7: Spam the search engines
Google lists a number of SEO techniques that will get a site banned. Some years ago Yahoo and Google issued an announcement that they share blacklists of banned sites. The reason search engines ban these sites is that there are some tricks that work. Because they work, Google has no defense except to ban the URL. This is one of the reasons why Google employs thousands of people solely to check results.
A common technique you can use is keyword stuffing. To do this, you simply place a large number of keywords at the bottom of the client's pages. We used to be able to do this invisibly, making the text for the keywords the same color as the background. However, Google detected this and now drops pages that contain text of a color close to the background. This is why you will often see sites with a huge number of plainly visible keywords at the bottom of the page. If you want to be really effective, each of these keywords should be linked to a relevant content page in the client site.
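The detection described above -- treating text whose color sits too close to the background as hidden -- can be approximated with a simple color-distance check. The threshold below is an illustrative guess, not a documented engine value:

```python
def looks_hidden(text_rgb, background_rgb, threshold=30.0):
    """Flag text as likely hidden when its color sits within `threshold`
    Euclidean RGB distance of the background. The threshold is a
    hypothetical value for illustration, not a known engine setting."""
    distance = sum((t - b) ** 2 for t, b in zip(text_rgb, background_rgb)) ** 0.5
    return distance < threshold

# Near-white text on a white page: the classic invisible-keyword trick
hidden = looks_hidden((255, 255, 255), (250, 250, 250))   # True
# Black body text on a white page passes easily
visible = looks_hidden((0, 0, 0), (255, 255, 255))        # False
```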
Some SEO companies generate keyword-stuffed pages on their own sites and then automatically redirect visitors to the client site. Thus, the search engine sees the SEO page while humans see the client's site. This technique is a variety of "cloaking": serving one set of content to search engines and different content to visitors. There are a number of techniques that can be used to do this. All of them work. Again, this is why Google employs many people to manually check results.
Some SEO companies invested in cloaking technology -- systems that automatically generated cloaked pages or entire cloaked sites. Of course, sooner or later they would get detected by the search engines and have their sites banned. Often the client's sites were banned too. These SEO companies would simply declare their businesses bankrupt so they didn't have to return any fees. They would then reform under a new company with a new name and simply start again. I saw one bunch of people do this three times before anybody in their area cottoned on.
These techniques offer you a toolbox of tricks with which to extract money from naïve SEO clients. To be a successful rip-off SEO company, you need good salespeople who can gather new customers on a regular basis because you won't get much repeat business. You also need to lock clients in with one-year contracts. This is not too difficult because most people understand it takes many months, if not years, to achieve results in SEO. It is therefore unfair to judge any SEO company -- even legit ones -- on the strength of a few months' work.
Of course, if you are operating like this, it is absolutely critical that you do not let potential clients talk to existing clients. Never provide referrals! Talking to existing clients of an SEO company is the best way to determine the quality of the company's work. If you're no good at SEO, don't let this happen to you.
Many SEO companies have built hugely profitable businesses using the techniques I have outlined. Some of them are major international brands. The critical thing to avoid is customers who have read this article. They will be extremely difficult to fool.