iMedia Connection

Announcing the Sugarshots Campaign!

Doug Schumacher
 
 
Campaign Details:
Client: Sugarshots, Inc.
Agency: Basement, Inc.
Ad Network: 24/7 Real Media
Ad Serving + Tracking: Atlas DMT
Site Analytics: Think Metrics
 
Editor's Note:
We are delighted to launch our iMedia Case Studies with Basement, Inc. and Sugarshots. We have also been extraordinarily fortunate to have widespread industry participation in this project even before its official start. 24/7 Real Media has generously -- and at great pains -- donated all the ad inventory for this test. Think Metrics in the U.K. has pitched in with the Web Analytics package. And Atlas DMT has contributed the ad serving and tracking tools.

As Doug explains, we're calling these case studies "Open Source Marketing," and we mean that sincerely. Over the next quarter, you'll see a chorus of contributors to this project, and we invite you to lend your voice to this chorus. If you would like to participate, please send email to iMedia Associate Editor Emma Brownell.

The series of tests we'll be running over the next 12 weeks is really about trying to gain a deeper understanding of one thing: which ads work and which ads don't. Not what a focus group thinks works. Not what the client thinks works. Not what the creative director thinks works. It's about what a million or so potential customers think works.

To achieve that, we'll be viewing ads in the nakedness of their own performance metrics. Like people, most ads don't look good naked. But online has a definite knack for stripping away everything but the grisly truths, like cost and acquisition-based metrics. So that's what we're going to show you: grisly truths, in plain, up-front detail.
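To make those cost and acquisition-based metrics concrete, here is a minimal Python sketch of how they are typically computed. All figures are invented for illustration; they are not Sugarshots campaign data, and the function name is ours, not part of any tool mentioned in this series.

```python
# Hypothetical illustration of basic cost- and acquisition-based
# ad metrics. All numbers are invented, not Sugarshots data.

def campaign_metrics(impressions, clicks, conversions, media_cost):
    """Return the basic performance metrics for one ad creative."""
    ctr = clicks / impressions         # click-through rate
    cpc = media_cost / clicks          # cost per click
    conv_rate = conversions / clicks   # post-click conversion rate
    cpa = media_cost / conversions     # cost per acquisition
    return {"ctr": ctr, "cpc": cpc, "conv_rate": conv_rate, "cpa": cpa}

# Example: 1,000,000 impressions, 2,500 clicks, 50 purchases, $5,000 spend
m = campaign_metrics(1_000_000, 2_500, 50, 5_000.0)
print(f"CTR: {m['ctr']:.2%}  CPC: ${m['cpc']:.2f}  CPA: ${m['cpa']:.2f}")
# prints "CTR: 0.25%  CPC: $2.00  CPA: $100.00"
```

An ad with a flattering click-through rate can still have an ugly cost per acquisition, which is exactly the kind of nakedness described above.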

This is what Dave Chase of the Altus Alliance has so aptly tagged Open Source Marketing. In actuality, it's what we go through in online marketing every week. Only instead of 10 people looking at the charts, this time we're inviting the entire interactive marketing industry.

Why test creative online?

This approach to testing wasn't possible a few years ago. Or at least it wasn't practical. While online advertising metrics have been around since the first banners, the big change came in the past several years with post-impression tracking.

At its inception, post-impression tracking was expensive -- prohibitively expensive, given the ravaged state of online advertising in the early 2000s. But now it's de rigueur.

The remarkable thing is that there are companies that still don't use it.

Granted, there is a multitude of ways to test almost anything. And I'm not slamming other research methodologies. Most of them provide good insight at varying levels of relevance. But strategic and creative concept testing has never really worked well in focus groups, the predominant method for testing them. That's my opinion, and the opinion of many others.

(I recently reviewed Gerald Zaltman's book, "How Customers Think," for the iMedia Book Club. If you use focus groups to test concepts, read that book.)

Online testing provides what focus group testing doesn't: a natural test environment. Online advertising happens in real time, in real life. It's real, and so the results are more likely to be real, too.

Online advertising is also relatively inexpensive. From producing the creative to generating a small media plan, the costs are marginal compared to most advertising budgets. Ironically, you'd probably spend more on focus groups than on an entire online test. And at the end of the online test, the campaign will already be up and running.

To be fair, no test is ever perfect. It's tough to test for a campaign that simply has to be shot with a supermodel on a remote beach in Antigua. However, there's a lot that can be tested online. And if approached methodically, the results just might lead back to that photo shoot anyway.

Sugarshots: the company and the product

To launch iMedia Case Studies, we've selected a liquid sugar product called Sugarshots. This is a real, new CPG product in a new product category. While launching new products and categories can be tough, there are several reasons why Sugarshots is a good product for testing.

For one, we don't have to worry about past brand experiences weighing on the product.

Also, liquid sugar is a low-involvement purchase. It is conceivable that at least a few people will be willing to make an online purchase without previous brand or category experience.

Lastly, while there's little known demand for the liquid sugar category, this is a product that should appeal to a large number of consumers across a range of demographics. We can therefore use a broad media plan, while more narrowly defining the target audience by monitoring response rates.

The testing construct

Theoretically, a test should start at whatever point your confidence in your research drops off. For a new product in a new category, that happens pretty quickly. So while we have some historical data on sweetener advertising -- as well as some Sugarshots research -- we're essentially starting from square one.

As such, we're going to be doing 11 different tests, divided into four phases.

Phase 1: Strategic Foundation

Our goal is to determine which of several foundational strategies will get the best reception. We'll cover more on each of these areas in the weeks to come. At this stage, however, our goals are to determine core messaging strategies and, with luck, to identify how different market segments respond.

Phase 2: Tactical Drivers

Tactical drivers are the blocking and tackling components of the campaign. Which sizes of media units work best? Do the ads need a strong call to action? Will a visual of the product in use be more effective than a straight product shot?

Phase 3: Emotional Drivers

This stage builds heavily upon the campaign intelligence that's been gathered to date. It's a little more abstract, as it gets into the emotional aspects of the message. We'll pit an ad promoting a positive experience against one promoting the avoidance of a negative experience. We'll test polar lifestyle approaches. The idea in this phase is to cover a broad range of styles, placing much greater emphasis on the creative execution.
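Head-to-head creative comparisons like these ultimately come down to deciding whether one ad's response rate is meaningfully better than another's, or whether the gap is just noise. Here is a minimal sketch using a standard two-proportion z-test. The numbers are invented, and this is a generic statistical approach, not the actual methodology of the Sugarshots tests.

```python
# Hypothetical sketch: is one creative's click-through rate
# meaningfully better than another's? Uses a standard
# two-proportion z-test; all numbers are invented.
import math

def two_proportion_z(clicks_a, imps_a, clicks_b, imps_b):
    """z-statistic for the difference between two click-through rates."""
    p_a = clicks_a / imps_a
    p_b = clicks_b / imps_b
    p_pool = (clicks_a + clicks_b) / (imps_a + imps_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / imps_a + 1 / imps_b))
    return (p_a - p_b) / se

# "Positive experience" creative vs. "avoid a negative" creative
z = two_proportion_z(600, 200_000, 480, 200_000)
print(f"z = {z:.2f}")  # |z| > 1.96 -> significant at roughly the 95% level
```

The practical point: with response rates this small, you need large impression counts before a difference between two creatives means anything, which is one reason a million-impression online test can settle questions a focus group can't.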

Phase 4: Extended Analysis

Once we've finished Phase 3, we'll take an additional look at using some of the optimization tools that the sponsoring network -- 24/7 Real Media -- has to offer. While these optimization tools are very effective, they could skew our early results. So we'll introduce them after we've established our strategic and creative baselines.

Once those baselines are in place, though, the optimizers can really boost a campaign's performance.

Methods of evaluation

We'll monitor many data points throughout this campaign, but here are the key ones:

Final notes on the testing model

There are many ways a test like this can be structured, depending on a number of issues related to the product or brand's situation. This approach is designed to address questions pertinent to Sugarshots.

Although this is a 12-week test, in actuality, monitoring the performance of campaigns and trying variations of creative should be an ongoing practice.

There are a number of details to be considered with an online test: the creative presentation, frequency caps, the network or media plan employed, and the sequence of the creatives. Those are but a few factors that can skew your results one way or the other. As long as they're accounted for, the test should provide reliable results.

Lastly, this is about audience participation, and our aim is to strike up a dialogue with people in the industry and get your thoughts on this and other testing methodologies. The potential for online testing is enormous, and we're interested in any input that can further enhance and demonstrate the capabilities of online advertising.

Thanks for reading, and we hope you enjoy the next 12 weeks!

Doug Schumacher is the President of Basement, Inc.