Is marketing research suffering from an identity crisis?

Through a special arrangement, presented here for discussion is an excerpt from a current article from the Joel Rubinson on Marketing Research blog.

Marketing research has an identity crisis. So now we have the Insights Association (a merger of CASRO and the MRA) and a speaker at the annual MRS conference in the UK saying we should no longer use the label “market research.” Is the word “research” dead in favor of “insights”?

Dear researchers: the problem isn’t in the name — the problem is that we are losing relevance with regard to the key problems affecting this digital marketing age.

Does programmatic marketing really deliver higher ROI? How do we construct segments to target via ad serving rules for best performance? How can we best use mobile marketing? What media strategy works best to drive trial for a new brand? How can we target someone during their purchase journey and does it actually produce better results? We cannot address these (and other) important prediction questions with legacy insights tools.

And in frustration, marketing teams are turning elsewhere. A media client of mine told me, “The programmatic people tend to have disdain for ‘small data’ researchers, and the researchers tend not to understand programmatic concepts and tools.”

The key is that the marketing research function needs to expand its mission:

“Driving repeatable marketing success through predictive insights built on data, evidence, and analytics.”

Predictive insights are an idea plus a forecast. To deliver, we need to move closer to the intersection of data science and marketing research, and seek a new kind of insight — predictive rather than descriptive. Relating to shopper research, we need to move on from insights about how people shop or visually scan a shelf to predictive engines: “If I re-arrange the shelf, aisles, etc., or if I target a shopper with a beacon-triggered offer, here is what is likely to happen to the ring (sales at the register).”
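
To make that concrete, here is a minimal, purely illustrative sketch of what such a predictive engine could look like: a simple regression fit on hypothetical store-week data to forecast the sales lift from a beacon-triggered offer. The data, column names and effect sizes below are invented for illustration, not drawn from the article.

    # Illustrative sketch only: synthetic store-week data with invented columns.
    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(0)
    n = 500
    beacon_offer = rng.integers(0, 2, n)      # 1 = shopper received a beacon-triggered offer
    shelf_facings = rng.integers(2, 8, n)     # facings after a hypothetical shelf re-arrangement
    store_traffic = rng.normal(1000, 150, n)  # weekly store traffic

    # Synthetic "ring" (register sales) with an assumed effect for each driver baked in.
    sales = 0.02 * store_traffic + 1.5 * shelf_facings + 4.0 * beacon_offer + rng.normal(0, 2, n)

    X = np.column_stack([beacon_offer, shelf_facings, store_traffic])
    model = LinearRegression().fit(X, sales)

    # The "predictive insight": expected change in sales if the offer is triggered,
    # holding facings and traffic at typical values.
    scenarios = np.array([[0, 4, 1000], [1, 4, 1000]])
    lift = model.predict(scenarios)[1] - model.predict(scenarios)[0]
    print(f"Estimated lift from the beacon offer: {lift:.1f} units per store-week")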

The word “repeatable” is really important: predictive insights should be an annuity, producing improved performance with each campaign, each brand, each year. And predictive insights are different from foresight (also needed) — they are built on analytics across multiple data streams, not scenario planning. At this intersection, the insights function needs to be held accountable for improving marketing ROI, not just delivering ideas.

Call it insights or research — but if we stay with retrospective and descriptive insights, mostly rooted in surveys, we will lose relevance.

BrainTrust

"Calling everything 'insights' trivializes the concept, the difficulty generating insights and the value true insights can provide."

Camille P. Schuster, PhD.

President, Global Collaborations, Inc.


"I think it’s way past time for a marketing revolution ... Tools aren’t strategy."

Ryan Mathews

Founder, CEO, Black Monk Consulting


"Market research was to provide “answers” when I was a brand manager."

Ben Ball

Senior Vice President, Dechert-Hampe (retired)


Discussion Questions

How can shopper marketing teams best leverage traditional and more advanced streams of intelligence? Is there a natural clash between devotees of data science and predictive analytics and those relying on survey-driven and observational insights?

16 Comments
Dr. Stephen Needel
Active Member
6 years ago

Joel’s point is clear — our job is to provide direction based on evidence. If my goal as a researcher is to provide answers or direction to marketing, I need to make use of the most appropriate tool in my tool box. Sometimes it’s a focus group, sometimes it’s a survey, sometimes an experiment and sometimes a deep dive into Big Data. It’s not either/or, it’s both and more.

Camille P. Schuster, PhD.
Member
6 years ago

None of the questions asked can be answered without marketing research. The big problem is that people are sloppy with terms, especially the term “insights.” Market research generates data. With an understanding of the parameters around collecting the data and the tools used for analysis, marketing research yields information about what the data means. Insights can be generated by examining the information extracted from one piece of research along with everything else known about the situation and players involved. Insights, marketing research, data and information are all different. Calling everything “insights” trivializes the concept, the difficulty generating insights and the value true insights can provide. Expecting “insights” to mean all things in all situations means they have no value.

Herb Sorensen
6 years ago

It’s really quite simple: RESEARCH is the way you gain INSIGHTS. However, it can be a two-way street. The best way to do research begins with possibly even “anecdotal” observations that prompt tentative insights. This is the beginning of empiricism, the door to science. Then comes the sometimes-hard work of naming what is being observed, and counting observations across time and environments. That meets Lord Kelvin’s requirement: “If your knowledge cannot be expressed in numbers, it is of a meager and unsatisfactory sort.”

As numbers accrue, the mathematical relations appear, and now can be represented in formulas. Combining multiple formulas delivers a mathematical model. NOW you have INSIGHT!
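
A toy illustration of that progression, from counted observations to a fitted formula and then a usable model (the numbers below are invented for illustration, not Mr. Sorensen's data):

    # Toy illustration: counted observations become a formula, i.e. a simple model.
    import numpy as np

    # Hypothetical counts: seconds spent in an aisle vs. items purchased
    dwell_seconds = np.array([10, 20, 30, 45, 60, 90, 120])
    items_bought = np.array([0.2, 0.5, 0.8, 1.1, 1.6, 2.3, 3.1])

    # Fit a straight-line relationship to the observations
    slope, intercept = np.polyfit(dwell_seconds, items_bought, 1)
    print(f"items ~ {slope:.3f} * seconds + {intercept:.3f}")

    # The fitted formula can now be used predictively, e.g. for a 75-second visit:
    print(f"predicted items at 75 seconds: {slope * 75 + intercept:.2f}")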

Ryan Mathews
Trusted Member
6 years ago

I think it’s way past time for a marketing revolution. It’s a new age and we have new tools and yet — for reasons largely unknown even to ourselves — we seem to feel compelled to fight old battles over and over again.

Tools aren’t strategy. How can shopper marketing teams best leverage intelligence? By using the best, most appropriate tools on the problem at hand, not worrying about whether or not one size really does fit all. Is the clash between devotees of data science and predictive analytics and those using other tools “natural”? No, it’s artificial and contrived.

Charles Dimov
Member
6 years ago

Marketing has definitely been evolving faster than ever before. It isn’t just the age of digital marketing. It is digital, social, data-driven, analytics-focused, observational and predictive. Marketing today has to be a mash-up of all of the above. But at the end of the day, marketing isn’t an academic science. It is a tool to be used by business — retail and otherwise. The data — big and small — must be geared toward making choices and decisions on direction and simply has to be used predictively.

With today’s volume of information, client responses and varied information sources, marketers need to have a grasp on a variety of techniques and sources that can provide business direction and choice-points on what to do again, and what NOT to repeat.

gordon arnold
6 years ago

The problem here is too many chiefs for a single goal. The problem is exacerbated by old 20th-century reporting methods being used to point to irrelevant information. A lot of money is being spent on people and plans that keep getting it wrong because they insist on using the ways and means of yesteryear.

Lyle Bunn (Ph.D. Hon)
6 years ago

I am stuck on the fact that the business function of insights is buried under the weight of statistics being available to support any decision. Being a pragmatic person (math was my major), I am optimistic that customer experience disciplines are best positioned to be agents of change based on insights. The truth can set the best actions and ideas free, but the best data and analysis are useless if the appetite and intention to act on them are inadequate. Change management is at the core of business optimization.

Ben Ball
Member
6 years ago

Legacy research departments, techniques and companies are a little bit like legacy IT systems — they are always a hindrance to change. Social media and open architecture coding made legacy IT systems change, but only when the money moved. The same is true with research (and “legacy” anything — to be fair.)

Market research was to provide “answers” when I was a brand manager. General Mills had several very cleverly devised testing protocols that were designed to ensure that maverick young brand managers couldn’t run big brands like Betty Crocker and Cheerios off the rails. We could do anything we wanted to in development or creative — but it had to “beat current among current users” in our standard testing to be approved and implemented. Smart, eh?

When we figured out that research really gave “direction” rather than answers, some of the ruling authority cachet wore off. But it was still the standard for “real marketers.” And “real research companies” didn’t do that loosey-goosey in-store stuff or “shop-alongs” or any of that. No way. Matched panel testing or die!

That’s how our company came to design and manage our own research projects. We always said we didn’t go to research — research fell to us because traditional houses didn’t know the right questions to ask or the techniques to use to learn that “shopper stuff.” Until the budgets began to move. Whoa! Did the big departments and research houses ever change then! Suddenly it became imperative to “introduce real research rigor” into the in-store/shopper insights work. It was really pretty funny to watch.

And now we go to “insights.” Well, at least we have the objective right finally.

Ron X
6 years ago

Good marketing research is hard. Many analysts are unaware of methodological problems or take unwarranted shortcuts. Often the most useful information comes from exploratory research surprises (i.e., “data science”). Any “Big Data” findings need traditional research verification. I think “insights” should be reserved for fact-based strategic recommendations, which may go beyond the backgrounds of the research analysts who know the methods involved. Managers with a strategy focus usually have limited technique training. They should heed the advice of those trained in research design. Information from researchers can be turned into insights, but strategists should not do the research themselves. Data can only be stretched so far (e.g., what “insights” can a localized, non-random survey of 100 shoppers suggest?).

A shopper marketing team needs to divide labor (researchers and strategists) and respect each member’s role. Over time, researchers can probably learn strategy more easily than strategists can learn methodology. Conflicts between predictive analytics and survey/experimental methods tend to deal with method biases and the costs of addressing a question. The best technique varies by question.
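
As a rough, illustrative calculation behind that last caveat (and it assumes a simple random sample, which a localized convenience sample is not), the best-case margin of error at n = 100 is already close to ten percentage points:

    # Back-of-the-envelope: 95% margin of error for a proportion from n = 100 respondents.
    import math

    n = 100
    p = 0.5  # worst case (maximum variance)
    margin = 1.96 * math.sqrt(p * (1 - p) / n)
    print(f"Margin of error at n={n}: +/- {margin:.1%}")  # about +/- 9.8 percentage points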

Julie Bernard
6 years ago

Our industry’s approach to data-driven insights must evolve beyond shelving choices and geo-aware push messages. While there is absolutely a place for “predictive insights” in brands’ and marketing’s strategy, the deep data-access and long-term consumer loyalty we seek lies in something more along the lines of “anticipatory inspiration.”

Meaning, the industry needs to look to its insights and analytics teams for more than just the ways behavioral patterns can lead to conversions in physical spaces; brands and marketers need to prompt and delight the audiences they’ve come to understand with ideas and products and experiences those shoppers haven’t thought of yet.

Millennials and Gen Z consumers, in particular, respond to this tactic. With billions of future dollars attached to these two demographics’ evolving decisions, it’s time for our data-based efforts to evolve as well.

Ralph Jacobson
Member
6 years ago

We in marketing are expert at wearing out words in general, as the examples given in the article show. I do think, though, that in the past two years there have been game-changing advances in marketing analysis capabilities that we only dreamed of in the recent past.

Are surveys dead? Not a chance. Surveys with scientific sample sizes still provide great information that marketers can effectively leverage.

Further, there are some very new technologies around predictive modeling, real-time personalization, intelligent tagging, product sequencing and several others, driven by true machine learning tools, that are providing retailers around the globe with true, actionable insights in the purest sense, as we speak.

Ricardo Belmar
Active Member
6 years ago

There is plenty of hype to dig through when it comes to analytics, data science, and marketing insights. “Insights” is becoming as misused a term as “omnichannel”! What’s getting missed by so many is a big picture view — not just of tactics fueled by data, but of overall customer-centricity as it encompasses all of your marketing tactics as a whole.

Having insights from your data is a measure of knowledge. How you choose to use that knowledge is what determines your success. The key for brands is not to lose sight of the customer. Without that as your focus, all the metrics in the world won’t tell you what you need to know to succeed.

Pavlo Khliust
6 years ago

To produce relevant data, researchers have to rely on multiple sources. There is no one-size-fits-all strategy here. An individual approach to collecting data is crucial for insight relevance. Predictive analytics plays as important a role as observational insights do. It’s just a question of how to combine traditional and more advanced streams of intelligence, and to what extent to use one or the other. Marketing teams have to clearly understand/predict the outcome of the requested research, then select the most appropriate tools to achieve the highest probability of collecting the most objective data possible.

In short, yes, there is a clash between them to a certain degree.

Mark Price
Member
6 years ago

Shopper marketing today must combine the best of quantitative and qualitative methods to identify insights that can change businesses. More important than the methods, however, are the questions.

Market research must ensure that the questions it is solving are current and meaningful. Current, as in reflecting the most recent advances in technology that are impacting consumers today. Meaningful, as in the answers to the questions have the potential to shift the marketing effort and improve customer acquisition and engagement in measurable ways. Issues regarding point solutions are nowhere near as important as attribution analysis and a deep understanding of multichannel communications and the context/problem that each digital (and non-digital) tool solves for specific segments of consumers. To solve those questions, MR must use both qualitative and “hard” data tools.

Only by focusing in this way can market research avoid the risk of irrelevancy.

Scott Magids
6 years ago

The real relevance in market research and data analytics does not lie in the numbers. Those numbers may tell us what is happening, but they don’t tell us why. The real value lies in how close the research brings us to understanding customers as human beings. The numbers might tell us, for example, that people between the ages of 25 and 34 buy our products, but that information is of little use until we understand why they do buy – and why people outside of that age range don’t. Rather than predictive insights, advanced analytics should give us emotional insights – what are the emotional motivators that drive behavior? As the technology grows more sophisticated, the gap between analytics and observational insights is closing.

Doug Garnett
Active Member
6 years ago

“Insights” is a really bad name because it obscures what’s most important from market research: actionable findings. Of course, I also dislike the word “actionable” because it sounds bureaucratic.

Yet we should stop and remember that an “actionable finding” is something we learned that, IF WE TAKE ACTION BASED ON IT, either saves us significant money by avoiding losses or makes us big money through actions that are more powerful because of what we learned.

Unfortunately, I’ve run into too much market research where the findings are NOT actionable. The problem seems to be a classic silo problem — where the silo confuses “well executed” with “discovered how to make our actions more profitable.”

What should be done? I think it remains organizational. The research teams need to learn how to do the research that gives us actionable findings and doesn’t just gather data. Even more important, management teams need to get far, far better at taking actions based on the findings of research. (There’s research showing that research success is most heavily dependent on whether executives know how to review it and use it.)

As to data … it’s just one more potential place to learn something. But the responsibility stays with US to run the business by looking across what can be known and making human decisions that create success.