Why your marketing intuition is probably wrong

Most of us grow up and go through school with a decided lack of enthusiasm for tests. We worry about them, we study for them, and we sometimes fail them. Typically they weigh heavily on our academic success or failure. Who hasn't had a dream as an adult where they show up to class only to learn to their dismay that there's a test that day for which they haven't prepared? I know I have. As we grow into adulthood, the word takes on new meaning to represent a trial or tribulation (e.g., "That long boarding process for my flight tested my will to live").

With that in mind, perhaps it shouldn't be surprising that we often don't spend the time and attention needed to test the various aspects of our email marketing programs. After all, we've been conditioned to have an aversion to the word. But that's no excuse! Personally, I've always been a big believer in testing email marketing efforts. And that holds true for every kind of marketing email, whether it's intended for customer acquisition or customer retention, and whether it's part of a campaign or a triggered send.

The importance of testing your emails was brought home to me again recently when I participated in a keynote session devoted to email testing at an email marketing conference. During the course of an hour, my co-presenters and I shared 14 different tests with the audience. The tests ran the gamut from opt-in forms and landing pages to subject lines and template designs. The audience was composed of email marketers, email agencies, and vendors. Using their cell phones, they voted on which variant won each test before we revealed the actual results. (On a side note, I used Poll Everywhere to do the voting. It's an amazing tool that shows text voting results in real time in PowerPoint. And no, I don't have a financial stake of any kind in the service!)

Anyway, this august body of email mavens guessed the right result only 50 percent of the time. That's right: In one-half of the tests, they wrongly predicted the winner -- sometimes in a landslide. And yet they did better than I did! When I first reviewed the tests, I got it wrong almost 75 percent of the time! As a marketer, I have always relied on my intuition. The very best marketing people I've ever worked with were those who had great instincts for the messaging that would produce the best results. So it was sobering to me -- and to the audience, I suspect -- to be so starkly reminded that email marketers who rely primarily on their intuition put their programs at risk.

It all comes down to those pesky consumers! When it comes to their inboxes, they rarely behave in a way we can predict. And to make things even more complicated, a tactic that wins a test this month can turn into the biggest loser next month. Using all caps in your subject line might have been the right move yesterday, but it could turn out to be a terrible idea the next time around.

In its own way, this makes sense. After all, if testing indicates that Tuesday at 10 a.m. EST is the best time to launch campaigns, it might suddenly become one of the worst times to do so once every company starts sending at the exact same time. And while my all-caps example might help you gain attention for a flash sale, if used too often, the tactic might lose its impact and become "invisible" to your subscribers. (When do you think a smoker last actually noticed the warning label?)

In short, it's not just important to test; it's equally important to keep testing. The emails you send are subject to the influences of every single other email that lands in your subscriber's inbox.

So what should you be testing? Don't be afraid to test anything. I generally think of testing in terms of five buckets: content, targeting, timing, email design, and forms. Copy (an element of content) often makes the biggest difference. Relatively easy copy tweaks can produce a surprising jump in ROI. But, as Pee-wee Herman put it in his Big Adventure, "Everyone I know has a big but." And I have one as well. It's important to test your campaigns, but if you don't clearly establish the measure of success up front, you might be doing more harm than good.

Those of you who are kind enough to follow my column regularly know that I often rail against the practice of relying solely on opens and clicks as a measure of success. Nowhere does this hurt you more than in testing your campaigns.

One of the tests we reviewed at the conference was a simple subject line test. The campaign was designed to generate registrations for a webinar. Subject line A clearly generated more interest as measured by opens and clicks. If that were your measure of success, you'd declare victory and send that version to your entire list. However, the test also revealed that Subject line B, while attracting less overall interest, was more successful at engaging the people who were likely to register. In fact, it crushed the other subject line in that regard. So which subject line really won? B did. Sure, it took a little longer to determine the winner, but it's the results that matter. We reviewed another campaign that tested two offers; once again, the version with the most opens and clicks was not the one that led to the greatest number of transactions.
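To make that distinction concrete, here's a minimal sketch in Python. The numbers are invented for illustration (they're not the actual conference results), but they show how the two metrics can crown different winners:

```python
# Hypothetical A/B split: each subject line went to 10,000 subscribers.
# (Invented numbers for illustration -- not the actual conference data.)
variants = {
    "Subject line A": {"sent": 10_000, "opens": 2_400, "registrations": 36},
    "Subject line B": {"sent": 10_000, "opens": 1_700, "registrations": 85},
}

for name, v in variants.items():
    open_rate = v["opens"] / v["sent"]
    reg_rate = v["registrations"] / v["sent"]
    print(f"{name}: open rate {open_rate:.1%}, registration rate {reg_rate:.2%}")

# Subject line A: open rate 24.0%, registration rate 0.36%
# Subject line B: open rate 17.0%, registration rate 0.85%
# Judged by opens, A wins. Judged by the campaign's actual goal
# (registrations), B wins decisively.
```

The arithmetic is trivial; the point is that the metric you pick decides the winner before the test even runs.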

Many email platforms today make A/B testing simple and automated. The platform will send small batches to subscribers on your list and then automatically roll out the version that generates more opens and clicks to the bulk of your list. Based on the examples I just discussed, that's not necessarily going to produce the optimal result unless all you care about are opens and clicks. When all is said and done, conducting a rigorous testing program absolutely increases the amount of work you need to put into your campaigns. But by eliminating -- or at least reducing -- the guesswork (intuition) you build into your email marketing, you are going to get better results. So while you will never know with 100 percent accuracy what is going to work best, a good testing program can get you much closer to generating the customer reactions you seek.
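If you'd like to squeeze a bit more guesswork out of declaring winners yourself, one option is a quick significance check on conversions before rolling out a variant. Here's a minimal sketch of a standard two-proportion z-test using only Python's standard library (the numbers are again hypothetical, carried over from the example above):

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Test whether two conversion rates differ by more than chance."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical test batches: registrations out of sends.
z, p = two_proportion_z_test(conv_a=36, n_a=10_000, conv_b=85, n_b=10_000)
print(f"z = {z:.2f}, p = {p:.4f}")
# A small p-value (conventionally below 0.05) suggests the lift is real
# rather than noise; a large one means keep testing.
```

Your platform's reporting may handle this for you; the point is simply that the final call should rest on conversions, not opens.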

Chris Marriott is a data-driven digital marketing consultant.

