Focusing on impact, not ROI

“How do we get management to better appreciate the value of research?” Throughout my entire career, the research industry has discussed the challenge of how to increase the perceived value of research. Back in the late ‘90s when I was leading research for Coca-Cola in the Netherlands, a panel discussion with a group of European research directors asked this very question. I recall thinking that what they were really asking was, “How do we gain more appreciation for what we do for them?”

A review of conference topics over the years demonstrates the continued pervasiveness of discussions on value delivery improvement or ROI measurement. This ongoing discussion has been productive; it has helped the industry advance past providing data/information and insights (neither sufficient) and should help us get past storytelling (also not enough). But it has also, at times, been very unproductive as we have continued to rehash and pursue outdated ideas such as measuring the ROI of research at a project level.

Focusing on ROI misses the point

The desire to find a simple, quantitative way to value research keeps people searching down the well-trod ROI path but the reality is that this goal is a canard. The precise value of an insight and its influence on decision-making will always be complicated and subjective and cannot be meaningfully measured quantitatively.

The aforementioned 1990s panel concluded that researchers needed to do a better job of explaining and selling the value (ROI) of the research results to senior management because senior managers “just didn’t get it.” That view reflects a reality that as an industry we’ve been slow to accept, even today: we are not (yet) meeting management expectations. There still remains a strong undercurrent of belief that “if only management understood” and “the industry may not be doing well but MY team is nailing it.”

While at the time I was sure those panelists were wrong, I didn’t have adequate experience to articulate effectively what they were missing. Today, I know that if we have to explain it or prove it, then the value isn’t sufficiently clear. The problem is not with management, although they don’t understand how to get value out of research either. Rather, researchers need to make the value clearer by connecting their work more directly to decision-making.

If not ROI, then what?

It is my belief that researchers must focus on – and take personal responsibility for – one primary goal: to make an impact, every time.

Impact is not the same as ROI – impact is simple, obvious and measured in the eyes of the client. ROI is complex, requires application of metrics to a subjective topic and is determined typically by the provider. Also, impact is not always accomplished by simply answering questions that are asked of us. Time, place and context matter to both how and which questions must be asked and how the answers should be packaged.

So, we must shift our mind-set: we must accept that the value of research (or analysis of any type/scope) is based not on the quality of its methodology or even the quality of the insight generated. Instead, value is based directly on influence, i.e., the degree to which a project is perceived by the client and management (not the researcher) to affect their decision-making.

Embracing this new mind-set also requires us to recognize that, contrary to some assertions, management does not require precise measurement for the value of insights. Asking about ROI is management’s way of saying, “You are not delivering sufficient value or making an impact.” The effort to measure something subjective with precision is not just futile, it often backfires because the effort itself demonstrates a lack of understanding about how to make an impact.

And, we must accept that the receiver of the insight evaluates impact, not the producer/provider. Knowledge is power only when delivered to someone who is ready, able and willing to act on it. Additionally, value isn’t absolute, it depends on context and audience.

As a result, rather than defining (and limiting) our role to being expert information gatherers who deliver insights, to “get to the table” we must evolve our role to become consultants who aim to deliver a coherent insight in the form of a POV to the right person at the right time. Effectively gathering, organizing and distributing insights are table stakes; shifting focus to connecting decision makers to the specific insight that will influence decision-making requires understanding the detailed business context in which those decisions are being made.

That last bit is perhaps most important. Too often, researchers do not feel the need to pursue – and forget to ask for – an understanding of the business context underlying the project, e.g., who needs to be convinced? What are the political implications? The barriers to action? What has already been tried? As a result, researchers may well answer the research question they’ve been directly asked, e.g., “How many people do X?” but fail to provide the specific insights needed in the right order and context, or fail to provide the insights in time and for the right audience.

Despite answering the questions asked of us, researchers are still not perceived to be adding value to the degree expected by management. Despite having clear objectives, we fail to deliver impact.

This is not a new idea or even originally mine. I first learned this while in the Army from a Coast Guard Admiral who said to me after I finished my briefing on the week’s drug interdiction intelligence, “OK, Lieutenant, good briefing. Now what do you want me to do about it?” He made it very clear to me that the intelligence was only of value based on the actions it drove and that providing relevant recommendations was part of my job.

Similarly, Tom Long (briefly the leader of global research at Coca-Cola en route to roles as a division president and later CEO at MillerCoors LLC) once caused a bit of an uproar when he told the global research team that the purpose of a report is NOT to deliver insights. A report should SELL a POV about what the client should do based on the facts and data at hand. 

No client asks a question just because they are curious. They (or their boss) ALWAYS have something in mind. The key to delivering value is to understand that the premise that “insights lead automatically to action” is FALSE. To be perceived as adding value, insights must be actively connected to a recommendation/POV based on business context.

Barriers and benefits to change

There are many challenges to implementing this changed perspective. The most basic of these is that gaining clarity of purpose (understanding the business context) is often conflated with clarity of objectives or clarity of questions to be asked. We incorrectly think that we are already doing this!

Clarity of purpose means understanding how the information will affect decision-making THIS time, in the existing business context. Gaining clarity on purpose up front is too often undervalued by researchers and marketers alike. However, if researchers remember that the real goal is to get to the table, then the questions we ask to gain clarity on business context become need-to-know rather than nice-to-know. They serve to engage our clients in a more productive discussion and to reposition us as partners rather than information gatherers. They tell us how to make a difference.


Each time we fail to take the step of seeking clarity of purpose/context, even on simple, tactical projects, we degrade the perceptions of us within the client organization. But gaining clarity of context isn’t always easy; there are definite obstacles and it isn’t as easy as simply asking, “What are you going to do with this information?” because:

  • Our clients don’t always know (or haven’t yet thought through the next steps). It is common that the person who comes to ask a question may not, themselves, fully understand the context behind it.
  • We CAN often get away with less, particularly for tactical studies and issues.
  • Being effectively assertive – something called “red team” thinking – is not easy. Challenging someone’s questions, surfacing assumptions and being a good devil’s advocate requires experience and tact, especially in the initial stages of a client relationship.
  • Researchers are not marketers, zebras are not horses. A research director (and friend) was told by industry leadership that he needed to hire separate people for methodology vs. consulting roles. They (incorrectly, in my opinion) believed that one person could not do both.

Additionally, making this change requires not just a change in how we approach our task but in how we measure ourselves. It is no small shift to move from measuring people on project management skills and the quality of insights delivered to measuring the effectiveness of their influence and the impact they make.

My own efforts to change the approach of people who have worked with/for me over the years have made it clear that this is not a simple adjustment. And two separate conversations I had recently on this topic with leaders in Fortune 100 Insights organizations reinforce that there can be internal resistance and pushback on how to define the term and the responsibilities.

But taking this approach and seeking to understand business context is the hidden solution to most of the common problems we experience, from lack of client engagement to how best to ask questions, narrow the focus of an analysis or shorten a survey. When something goes wrong, another benefit of having this knowledge is that we can approach clients with reasoned recommendations rather than just options/choices.

Moving forward: Accept, adopt and adapt

Based on recent conversations on this topic at conferences and discussions with clients, I’ve concluded that mind-sets are shifting in the direction of impact, but slowly. We are using the language today but what still seems to be missing is the recognition of the need to take personal responsibility for gaining clarity on the desired impact and a clear vision on HOW to do it.

I believe that instead of ROI, insights and stories, the next step for our industry is to focus on impact and influence, taking personal responsibility for the perceived value of the work we do.

This is not an easy pivot for our industry. To do this effectively, first we must accept the fact that insights and stories are not enough to deliver value. Second, we must adopt the role of consultant – clients long ago gave us permission to become assertive, constructive, consultative partners rather than just data gatherers and insights providers. To become a proactive, consultative partner, we have to be as good at pursuing an understanding of the business context as understanding the questions to be answered. No amount of customizing insights and storytelling can ensure value delivery without it.

Finally, we must adapt reports from providing “an answer” to providing a POV that fits the client’s business context and is customized to the audience. This doesn’t mean that we get to TELL our clients what to do; our job remains to influence but that includes providing objective POVs based on what we know of the options available to our clients.

Ultimately, embracing this viewpoint means recognizing that today’s researchers need to know more than just how to create insights and disseminate them – we need to be able to influence the organization to use those insights. That is management’s expectation. Insights delivered in this manner will get us to the table, provide clear value and inspire client confidence.


Bruce Olson, Managing Partner, MMR Research Associates