The ‘data-driven’ mindset feeds our dangerous craving for certainty

Exploring the principles of a decision-driven approach

Kyle Byrd
UX Collective


An abstract image showing a brain with a graph behind it
Source: Midjourney

In a previous post, we introduced a conversation that isn’t new (it was raised as early as 2010) but is coming back with force: decision-driven vs. data-driven decision making.

Within the last couple of years, this ‘data science + decision science’ perspective has surfaced from the decision intelligence community, led by Cassie Kozyrkov, Lorien Pratt, Judea Pearl, and others, with roots in AI/ML rather than in frustrated ‘business people’.

But the argument isn’t an either/or ultimatum; it’s a caution that there’s danger in starting with data as the focal point, especially without interrogation or any defined purpose.

“Data-driven decision-making gets people into trouble for two reasons — we tend to put data on a pedestal, but then fail to think critically about how the data was generated and jump to conclusions … Problem two is that we’re asking the wrong questions.”

Stefano Puntoni, Decisions, not data, should drive analytics programs

The pursuit of information is often in service of confirmation bias, not learning or understanding — but this isn’t done maliciously. It’s a natural, self-reinforcing tendency to confirm our existing beliefs and feel certain in our decisions.

Certainty is an emotional state that we crave — the alternative, of course, is uncertainty and ambiguity, which is viscerally uncomfortable to stomach.

“Despite how certainty feels, it is neither a conscious choice nor even a thought process. Certainty and similar states of “knowing what we know” arise out of involuntary brain mechanisms that, like love or anger, function independently of reason.”

Robert A. Burton, MD, On Being Certain

A self-reinforcing cocktail of cognitive biases provides us with this sense of certainty:

  • Confirmation bias: We tend to search for, interpret, and remember information in a way that confirms our pre-existing beliefs and ignores contradictory evidence. This bias can make us more susceptible to believing false information if it aligns with what we already think.
  • Ambiguity aversion: If a piece of false information provides a sense of certainty or closure, there’s a chance that the brain might prefer or accept it over a more ambiguous, albeit accurate, piece of information.
  • Need for cognitive closure: When faced with an event or outcome, our brains seek causes. This can lead to the invention or acceptance of false causes if no clear, true cause is evident.

“Our default is to believe that what we hear and read is true. Even when that information is clearly presented as being false, we are still likely to process it as true.”

Annie Duke, Thinking in Bets

  • Simplicity bias: Accurate information is often more complex than false or oversimplified versions. The brain tends to prefer simpler stories and explanations because they require less cognitive effort to process.
  • Groupthink: Group consensus can provide a feeling of certainty and comfort, even if that consensus is based on misinformation (e.g., collective illusions).
  • Cognitive dissonance: ‘Uncomfortable truths’ are difficult to accept, so we tend to discount them or explain them away.
  • Illusory causation: We tend to perceive a causal relationship between two events that are actually unrelated, simply because they are frequently paired or occur close together in time. This leads people to infer cause-and-effect relationships where none exist (a small simulation of this follows the list).
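
To make that last point concrete, here is a minimal sketch (in Python, not part of the original argument) of how illusory causation can emerge from data alone: generate pairs of series that are independent by construction, then count how often they still look ‘strongly correlated’. The series length, the number of trials, and the |r| > 0.5 threshold are illustrative assumptions, not anything prescribed above.

```python
import numpy as np

# Sketch: independent, trending series often look correlated by chance.
rng = np.random.default_rng(seed=7)
trials = 1_000
strong = 0

for _ in range(trials):
    # Two independent random walks: no causal link connects them
    # (imagine weekly sign-ups and weekly support tickets).
    a = np.cumsum(rng.normal(size=200))
    b = np.cumsum(rng.normal(size=200))
    r = np.corrcoef(a, b)[0, 1]
    if abs(r) > 0.5:
        strong += 1

print(f"{strong / trials:.0%} of unrelated pairs show |r| > 0.5")
```

Whatever exact fraction this prints, it is typically sizeable: trending series frequently co-move by accident (the classic ‘spurious correlation’ effect), which is exactly the pattern that illusory causation, and a dataset presented without interrogation, can turn into a causal story.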

The output of these mechanisms is the feeling of certainty — and that’s what they’re optimized for. Without them, we’d be cognitively overloaded and paralyzed.

But we need these cognitive mechanisms to drive ‘strong convictions, loosely held’: the sweet spot between unfounded overconfidence (illusory certainty) and the paralyzing fear of uncertainty.

The pursuit of perfect information is an empty one; it’s chasing the illusion of certainty.

“You can’t know that things will turn out all right. The struggle for certainty is an intrinsically hopeless one — which means you have permission to stop engaging in it.”

Oliver Burkeman, Four Thousand Weeks

Towards a ‘Decision-driven’ mindset

A decision-driven mindset, proposed through ‘decision intelligence’, treats the decision as the focal point instead of the data. It directly combats the streetlight effect: start with the purpose and then shine a light, instead of starting with the light and searching for a purpose.

Two people stand under a lamp post. Everything is dark except for where the lamp is illuminating. The person standing up says to the other “Have you lost your keys?”. The other, who is crawling on the ground says, “Yeah, I lost them over there but the light is better here.”
Source: Sketchplanations

The means to measure shouldn’t drive our problem space; the problem space should drive our investment in what to measure.

As data calcifies our beliefs and assumptions, we’re more likely to interpret new information in ways that entrench those beliefs and assumptions rather than challenge them.

“It is an odd fact that subjective certainty is inversely proportional to objective certainty. The less reason a man has to suppose himself in the right, the more vehemently he asserts that there is no doubt whatsoever that he is exactly right.”

Bertrand Russell, The Scientific Outlook (1931)

Illusory certainty may very well be the current epidemic of modern strategy — which is likely why decision intelligence, as a combination of modern data science and decision science, presents a compelling evolution of the data-driven mindset.

So is data important? Of course!

Hopefully, no one reads this post and thinks the argument is otherwise — it’s about the application, not the data itself.

As Douglas Hubbard explains in his book, How to Measure Anything, there are three reasons to care about measurement:

  • When it informs key decisions
  • When it has its own market value and could be sold to other parties for a profit
  • When it provides entertainment, supports research, or satisfies a curiosity

‘Proving we’re right’ and ‘supporting an argument’ didn’t make the list. Informing a decision is a pre-decision activity, yet post-decision justification is often masked as ‘informing a decision’, surfacing words like “because…”, “certain”, “says”, “know”, and “will”.

A decision-driven mindset leads with:

  • Questions, not answers
  • Opportunities, not solutions
  • Challenge, not consensus
  • Dialog, not arguments
  • ‘I don’t know…’, not ‘I know…’

“Whoever cannot seek the unforeseen sees nothing, for the known way is an impasse.”

Heraclitus

Data is misused, miscommunicated, misinterpreted, or, worse, manipulated. In many organizations, ‘data-driven’ seems to have become a sort of dogma: it’s used as a filibuster to delay progress or to discredit dissenters by masking opinions as fact.

But data is not fact; it requires human interpretation and judgment.

“No one ever made a decision because of a number. They need a story.”

Daniel Kahneman

Data is often weaponized instead of applied. It feeds confirmation instead of curiosity and is used for convincing instead of suggesting or challenging.

Data is an ingredient, not the truth, and it must be interrogated. But our desire to feel certain is stronger than any desire for understanding.

To paraphrase the economist Mervyn King:

Observations are of little value without understanding the process that gave rise to them.

This post was originally published on 🔮 The Uncertainty Project — a resource on tools and techniques for strategic decision making and navigating uncertainty.
