4.0 Telling stories with data
“What story are we trying to tell with this data?” I’ve been asked – and have asked – this question many times. I’ve seen examples where the answer is ‘we need to show that x campaign performed to position this next project’. And I’ve seen examples where the answer is ‘we need to prove the value of doing y’. But why does data need to tell a story rather than just being presented? Doesn’t creating a story from data dilute, obfuscate or distract from the underlying truth? Well, maybe. But that doesn’t mean it shouldn’t still have a story. Data needs to tell a story because stories are how we remember. A single statistic can tell a story. That story – if it’s simple, unexpected and tangible – will be resonant and take on a life of its own. It will make people think, act or feel based upon it – assuming it elicits an emotion and they care about it. But where one insight can tell a story, ten statistics can just create noise. A PowerPoint deck with slides of graphs that build towards nothing, that fail to connect to each other or that no one cares about will fail in its ultimate purpose: to drive the right action. Data needs to tell a story, but with that story comes the risk that personal interest takes over from truth.
This is the shortest section of this site. It’s also the one that I wish was the longest. Why? Because data – specifically, a lack of real understanding of it – is what causes marketers’ love of alchemy to persist. You may have heard the story of the statistician who drowned crossing a river. He had calculated that the average depth was three feet, only to find out that an average was not the best value to use when his life depended upon it. This story is a fiction, but it illustrates the point. Data can be misleading, and how you look at data needs to be informed by an understanding of why you’re looking at it that way. If you’re trying to work out the volume of water flowing through a river, the average depth may be fine to base your calculations on. If you’re working out whether you can safely cross, the maximum depth is the better measure. When cooking food at a family barbecue, do you want to know the average time it takes to cook everything or the specific time it takes to cook each item? When you have to be on time for an interview, do you take the route with a 30% chance of being 10% late or the one with a 10% chance of being 30% late – and do you know the difference when you’re comparing the options?
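The river and the route both make the same statistical point: which summary statistic matters depends on the question. A minimal sketch, using entirely invented numbers, shows how two routes can have the same expected lateness while carrying very different tail risk:

```python
# Hypothetical illustration: the "right" statistic depends on the question.
# All numbers below are invented for the sake of the example.
depths_ft = [1.0, 2.0, 2.5, 3.0, 6.5]  # made-up river cross-section depths

mean_depth = sum(depths_ft) / len(depths_ft)  # fine for estimating flow volume
max_depth = max(depths_ft)                    # what matters if you're wading

print(f"mean depth: {mean_depth:.1f} ft")  # 3.0 ft - looks crossable
print(f"max depth:  {max_depth:.1f} ft")   # 6.5 ft - deep enough to drown in

# Two routes to a 60-minute interview, with identical expected lateness
# but very different worst cases:
journey_min = 60
route_a = 0.30 * (0.10 * journey_min)  # 30% chance of being 10% late
route_b = 0.10 * (0.30 * journey_min)  # 10% chance of being 30% late
print(route_a, route_b)  # both 1.8 expected minutes late
```

The expected values are identical, yet route B's bad day is three times worse than route A's. An average, on its own, hides exactly the information the decision hinges on.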
The examples from the world of science that you’ve read in this site are, as I’m sure you’ve realised, used more to illustrate a point than to prescribe a direction. They’re used to draw parallels and create a hazy mirror upon which to gaze at a business function that could use some self-reflection. None of this is to say that there aren’t highly data-minded people within Marketing; it’s simply a product of an environment where research and innovation almost always need to prove value straight out of the gate. A new CMO walking into their first board meeting is unlikely to have a long tenure if their performance report says “well, of the $100m we invested last year we have 95% confidence that we created $14.7m of direct pipeline, 50% confidence we supported $69m of pipeline and 30% confidence we influenced $420m of pipeline.”
Data needs to tell a story, be it to a prospective customer, an internal marketing decision maker or to a board member. “In the next 24 months, we will 1. grow our brand value by 25%, 2. create $500m of pipeline and 3. grow our NPS by 15%.” Or: “Today we’re known for X. Over the next 18 months we will expand into Y market and capture Z value.” These are closer to the level at which a board may expect a CMO to position their value. But data still underpins them. Data still adds concreteness and tangibility to what will be delivered. And in an environment with so many micro-KPIs, this data actually adds clarity.
Recall from back in the introduction the idea of waves washing up a beach. Too often Marketing functions like to define success by the impact of one wave. On one rock. One campaign to one audience in one quarter. Look selectively at enough rocks and pick only the ones that moved up the beach and you’ll be able to create a great PowerPoint slide with results. But beyond the inefficiency drain of pointless slideware, it means very little in the broader business context. Are your waves measuring the action of the sea (brand), the ripples of a pebble thrown in (a campaign) or the wake of a large boat that happened to power past (a big brand investment)? From the beach’s perspective it’s all still water washing at it.
Maximising impact and minimising waste requires a mindset shift away from micro-KPIs and one-off examples towards macro impact against bigger objectives. In my experience some marketers spend half their week creating PowerPoint slides to report on what they’ve done – imagine the productivity gain alone from not doing that. Yes, to understand whether your waves are carving a channel in weeks or millennia you need to record each wave, but looking at the one best-performing ad or the one best-performing piece of content is navel gazing. Move the mean performance. Move the median. Tighten the spread. A Marketing function can be confident it is having impact when it operates like a water jet cutter slicing through marble: no single droplet matters, but together, focused in one direction, they cut. Chasing exact impact at a granular level is not the way forward. Maximise waves. Cut out waste. Galvanise the organisation. Marketing needs to set a direction and maximise impact against that.
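“Move the mean. Move the median. Tighten the spread” can be made concrete with a small sketch. The figures below are invented return-on-spend numbers for two hypothetical campaign portfolios; the point is only how the distribution, not the single best performer, tells the real story:

```python
# Sketch with invented numbers: judging a campaign portfolio by its
# distribution rather than by its one best-performing outlier.
import statistics

last_year = [0.2, 0.3, 0.4, 0.5, 0.6, 0.8, 5.0]  # one outlier star campaign
this_year = [0.9, 1.0, 1.1, 1.2, 1.3, 1.4, 1.5]  # broadly, consistently better

for label, data in (("last year", last_year), ("this year", this_year)):
    print(label,
          "| best:", max(data),
          "| mean:", round(statistics.mean(data), 2),
          "| median:", statistics.median(data),
          "| stdev:", round(statistics.stdev(data), 2))

# The "best ad" slide favours last year (5.0 beats 1.5), but this year's
# mean and median are higher and the spread is far tighter - the whole
# beach moved, not one lucky rock.
```

Last year wins the highlight reel; this year wins on mean, median and spread. Only the distributional view distinguishes a portfolio that works from a portfolio that got lucky once.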
The scientific method uses data to qualify how confident you are in a conclusion. By contrast, the alchemic method aims either to justify a conclusion with data or to find conclusions within data where they may not exist. There’s no ill intent behind this; maybe it’s human nature? I doubt many people were ever promoted for saying “on the balance of probability the campaign impact trended towards zero.” No. If it’s in your interest to show your work has driven great results, if everyone around you is showing their work driving great value, and if no one is questioning you, why wouldn’t you? If the wrong behaviour is reinforced it becomes normalised. It becomes your culture. But it shouldn’t be.
It falls to leaders and leadership to understand the data, to look at the longer term and to avoid incentivising short-term vanity metrics when longer-term real value creation is the objective. Here’s the conclusion I hope you take away from this chapter: treat information with great suspicion until you know how it was collected, how it was analysed, and, most importantly, how the person who collected it plans to use it.