Last month we discussed how Colin Powell’s 40–70 rule applies to research. What we didn’t say is that results are only half the story and should not be taken as law-like truth.
Many research programs that produce seemingly clear results fail when applied to marketing. Why don’t they work? There was an ’80s rumor that went something like this:
Sony held a series of focus groups to validate new color lines for its popular “boom box.” After countless hours of color ideas from consumers, the participants were offered a free boom box, in the color they wanted. Almost everyone chose black.
The moral of the story: When you ask for somebody’s opinion, they’ll give you one based on how they think. But there’s a significant difference between what someone says and how he or she actually behaves. Your real results depend on who responds, what they’re asked, and current industry trends.
Let’s examine the research areas with potential for error:
If you want opinions about your brand innovation, they should come from people with the greatest likelihood (or history) of using it. If you’re looking for input from hospitals on a technology offering, do you include C-level executives? Yes, if they’re involved in the product’s use; no, if you’re selling rubber gloves.
In the decades I’ve been involved in research, I’ve seen some glaring errors repeated often. One of the worst: asking people point blank about a product and its value. Why is that a problem? Experienced researchers will tell you to take an anthropological approach, in which the interviewee tells you about an experience. From there, you extract the points you’re seeking.
You want to know how the industry perceives your brand, so you conduct a survey. If you only interview customers, however, you get their perceived value of the product they use, plus their feelings about the salesperson they dealt with, the customer support rep, and the collective value of their interactions with your organization. To get the big picture, you should also survey the largest component of the market: your prospects.
Do you expect research to answer all of your questions? If you don’t design it to achieve a specific objective, it won’t answer any of them adequately. You want to know how your brand compares to others? Why? It’s not a guarantee that a more popular brand has greater sales—just ask your local cable provider.
A better question might be: “What area of the market is currently underserved in the buyer’s mind?” That will tell you which product strengths buyers are looking for and reveal gaps in pricing (perhaps the market lacks a perceived value leader that balances the necessary functions with a fair price).
Less is More
When it’s all over, summarize the data into a logical story and make prescriptive recommendations that state the implications succinctly. Research can make marketing far more effective if you take steps to ensure you can trust what you see.
Next month we’ll be talking about analytics and their impact.
Dan Hansen is a Senior Partner with Red House and a 30-year veteran of the marketing industry. In addition to holding a master’s degree in advertising from Syracuse University, he works in a marketing consulting capacity with Red House clients such as McKesson, Elsevier, Equifax, and AT&T.