Marketing is becoming a data-driven discipline, and for good reason. But many companies are lagging behind this trend, putting performance and ROI at risk.

It used to be that marketing was looked at as a “soft skill,” whereas the finance people were the real quantitative minds. Nowhere was this misconception more prevalent than in Chicago, home to two of the top business schools in the world — the Kellogg School of Management at Northwestern University, known as a “marketing” school where the “creative” people go, and the Booth School of Business at the University of Chicago, known as a “finance” school where the “quant jocks” go.

As with most things, the truth lies somewhere in the middle. Both of those business schools churn out well-rounded business people, and marketing today requires both a quantitative and qualitative skill set.

But interestingly, it’s the creative part of marketing that still seems to lack the analytical rigor of other parts of the business. No one would consider running the Accounting department or Product Development or Customer Service without using data to guide decisions, but for some reason Marketing — and specifically its creative — is often immune.

Major brands still rely on simplistic A/B testing or general market research rather than more advanced testing methodologies. This approach creates three key problems:

The decision of what to test is completely subjective.

Someone comes up with the idea to look at colors or paper stock or imagery, so marketers add those items to a never-ending list of A/B tests. The result can be inconsistent branding and messaging that confuses prospects and customers.

A few years ago, one company tested replacing asterisks with footnote numbers leading to the fine print; the hypothesis was that people look at an asterisk and think “there’s a catch.” The footnote numbers outperformed by a double-digit percentage. Then they moved on to test another element.
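To make the mechanics concrete, here is a minimal sketch of how a simple two-variant result like that might be evaluated. The conversion counts below are hypothetical and invented purely for illustration; they are not the company’s actual numbers.

```python
from math import sqrt, erf

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Compare conversion rates of variant A (asterisks) and variant B (footnote numbers)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)                 # pooled rate under the null hypothesis
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))   # standard error of the difference
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))    # two-sided p-value from the normal CDF
    lift = (p_b - p_a) / p_a
    return lift, z, p_value

# Hypothetical counts, for illustration only
lift, z, p_value = two_proportion_z_test(conv_a=480, n_a=10_000, conv_b=545, n_b=10_000)
print(f"lift: {lift:.1%}, z: {z:.2f}, p-value: {p_value:.3f}")
```

Even a basic check like this makes a “double-digit lift” something you can interrogate: with small samples, the same lift could easily be noise.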

It is often the “HiPPO” — the Highest Paid Person’s Opinion — that ends up driving the test plan, if they allow testing at all. Often, HiPPOs simply rely on their instinct and years of experience to make key creative decisions.

In contrast, more enlightened leaders are starting to respond with, “It doesn’t matter what I think. What matters is what the customer thinks.”

Market research only tells half the story.

But even what the customer thinks isn’t always the same as what the customer does, which is the core problem with market research. Consumer Insights teams at major companies are often buried in countless surveys, focus groups, diary studies and other customer research methodologies, both qualitative and quantitative.

No matter how many prospects or customers you talk to, though, it can never replace knowing what they actually do out in the wild. This is why election polling results are so often wrong. People might register as Republicans but vote Democratic (or vice versa). Likewise, consumers may say they like your new ad because they don’t want to offend or cause trouble. People may tell you they understand something when they really don’t; in one recent study, three-quarters of Americans said they knew the definitions of four common healthcare terms, but only 4% correctly defined all four.

Plus, most prospects and customers are not designers, so asking them for their opinion on a piece of marketing creative is a little like asking a vegetarian for a steakhouse recommendation. Not only are they not experts, they are not necessarily representative of the larger population.

Simplistic testing doesn’t explain why something works.

I once worked with a client on a campaign test that used multivariate testing to look at several elements of a website logout page: the headline, the body copy, the image, and the call-to-action button. Internally, the client had been debating the image specifically: should it be lifestyle imagery or graphical, should it be posed or candid, and so on. No doubt there was a big meeting at a long conference table where this was discussed and a HiPPO made an important decision.

The results of the test told a very different story. It turns out that the image didn’t really matter much when it came to getting the customer to take action. What did matter, and mattered a lot, were the three little words in the call-to-action button at the bottom of the page.

With advanced analytics tools like multivariate testing and experimental design, marketers can learn not just what works, but why it works, and which elements of a marketing campaign are the most important. This is both good news and bad news for HiPPOs (don’t worry, no HiPPOs were harmed in the writing of this article). The bad news is that opinions are taken out of the equation, so influence and authority are transferred from an executive to the data, which tells all. The good news? Marketing results will improve drastically because campaigns will be based on how customers actually behave.
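For readers curious about the mechanics, here is a minimal sketch of how a full-factorial multivariate test can be decomposed into element-level “main effects.” The page elements mirror the logout-page example above, but the factor levels and conversion rates are hypothetical, invented for illustration rather than drawn from the client’s data.

```python
from statistics import mean

# Hypothetical results from a full-factorial test of a logout page:
# two headlines (H1/H2), two image styles (lifestyle/graphic),
# and two call-to-action wordings (C1/C2). Keys are (headline, image, cta);
# values are the observed conversion rate for that combination.
results = {
    ("H1", "lifestyle", "C1"): 0.041, ("H1", "lifestyle", "C2"): 0.062,
    ("H1", "graphic",   "C1"): 0.043, ("H1", "graphic",   "C2"): 0.060,
    ("H2", "lifestyle", "C1"): 0.045, ("H2", "lifestyle", "C2"): 0.066,
    ("H2", "graphic",   "C1"): 0.044, ("H2", "graphic",   "C2"): 0.064,
}

def main_effect(factor_index, level_a, level_b):
    """Average conversion at level_b minus level_a, averaged over the other factors."""
    rate_a = mean(r for cell, r in results.items() if cell[factor_index] == level_a)
    rate_b = mean(r for cell, r in results.items() if cell[factor_index] == level_b)
    return rate_b - rate_a

print(f"headline effect: {main_effect(0, 'H1', 'H2'):+.4f}")
print(f"image effect:    {main_effect(1, 'lifestyle', 'graphic'):+.4f}")
print(f"cta effect:      {main_effect(2, 'C1', 'C2'):+.4f}")
```

A real analysis would also check interactions between elements and statistical significance (a library such as statsmodels can fit the same decomposition as a regression), but even this toy version shows how the call-to-action wording, not the image, can emerge as the element that actually drives behavior.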

Marketing is no longer just a “soft skill.” It requires a careful balancing of both sides of the brain and has transitioned from a subjective art to an objective science. Decisions can now be made based on real, in-market data rather than intuition, and on what consumers do, not just what they say. The result is a better experience for everyone.