Binary Bias: Why Smart Leaders Oversimplify (and How It Quietly Breaks Decisions)
Most leadership mistakes don’t happen because people are careless; they happen because people are busy. Under pressure, with imperfect information, we reach for clarity. We want a clean answer. A decisive direction. A strong view. This is where binary bias can slip in.
Binary bias is the cognitive tendency to oversimplify complex information by treating continuous data as two opposing categories:
good / bad
success / failure
yes / no
right / wrong
us / them
Instead of seeing a spectrum, we default to dichotomous thinking. It can feel efficient, even reassuring, but it often distorts judgment because it ignores nuance and context.
Let’s be clear: binary thinking is not always wrong. Sometimes it’s necessary. In knowledge work, leadership, and strategy, however, it is often a shortcut that silently damages decision quality and outcomes.
Why leaders are especially vulnerable
Binary bias isn’t a “weak leader” problem: it’s a human cognition problem — amplified by leadership conditions. Research suggests leaders and senior managers fall into binary bias through a multi-layered system:
1) The brain likes easy categories
Fisher and Keil (2018) showed that people naturally impose categorical distinctions on continuous data — compressing evidence into discrete “bins,” and letting the difference between those bins drive the final judgment. This showed up across domains like health, finance, and public policy. In other words: even when the world is continuous, we think in categories.
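To make the binning idea concrete, here is a minimal sketch (the names, scores, and cut-off are all made up) of how a hard threshold turns a tiny continuous difference into an opposite category:

```python
# Hypothetical performance scores on a continuous 0-1 scale.
scores = {"Alex": 0.51, "Sam": 0.49}

# Binary bias in miniature: compress the continuum into two bins at a cut-off.
labels = {name: ("high performer" if score >= 0.5 else "low performer")
          for name, score in scores.items()}

# A 0.02 difference on the underlying scale becomes an opposite label,
# and the label gap, not the score gap, then drives the judgment.
print(labels)
```

The mechanism is the same whatever the domain: once the bins exist, the distance between bins feels far larger than the distance between the underlying numbers.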
2) Bounded rationality forces shortcuts
Abatecola et al. (2018) highlight how decision-makers rely on heuristics when cognitive resources are limited:
availability (“what comes to mind first”)
representativeness (“this looks like something I’ve seen before”)
confirmation (“this supports what I already believe”)
affect (“this feels right”)
These shortcuts aren’t random — they interact and reinforce each other. So once binary thinking begins, it feeds itself.
3) Some leaders actively prefer closure
Tetlock (2000) adds an uncomfortable layer: leaders with a strong need for cognitive closure often prefer simplicity. Not just because it’s easier — but because it reduces uncertainty. These leaders tend to favour:
clear recommendations over nuanced trade-offs
decisive messaging over open debate
“strong positions” over complexity
It can look like confidence, it can sound like leadership, but it may be categorical thinking disguised as conviction. (An interesting aside would be to look at differences between the sexes: an ‘assertive man’ versus a ‘bossy woman’, but I’ll hold that thought for another day).
4) Groups make it worse
Binary bias isn’t only individual — it’s social. Kamau and Harorimana (2008) highlight groupthink dynamics:
conformity to dominant viewpoints
self-censorship
withholding contradictory information
Cross and Brodt (2001) add the false-consensus effect: leaders overestimate agreement and undervalue alternative views. In groups, binary bias can become institutional.
What binary bias looks like in real organisations
Binary thinking doesn’t always appear as crude “black-and-white” language. More often it appears in subtle, professional forms:
Performance conversations
Guilfoyle (2015) showed how leaders often interpret performance using binary comparisons between isolated numbers, treating differences as meaningful without sufficient context:
“They’re a high performer / low performer.”
“This team is doing well / not doing well.”
“The project is on track / off track.”
Strategy discussions
Binary framing creates false trade-offs and can leave third (fourth, fifth…) options off the table:
“We need to be more innovative.”
“We need more control.”
“We either centralise or decentralise.”
“We should focus on growth or risk.”
Risk decisions
Binary bias shows up strongly when decisions are:
complex
time-pressured
high-stakes
emotionally loaded
morally charged (“taboo trade-offs”)
That’s when leaders feel compelled to take a position, rather than explore a landscape.
The real cost: it doesn’t just reduce accuracy — it reduces adaptability
Binary bias is not just a cognitive bias: it’s a strategic constraint.
Across the literature, consequences include:
1) Degraded decision quality
Binary thinking causes leaders’ subjective assessments to drift away from objective outcomes (Ramachandran & Gopal, 2010). Not because leaders are irrational — but because they’re over-compressing reality.
2) Missed opportunities
When the world is treated as two categories, options in the middle disappear:
“promising but not ready”
“low risk but high uncertainty”
“good enough now, better later”
Binary thinking narrows the field.
3) Strategic inertia
Wright et al. (2004) describe how organisations become slow to adapt — trapped in dominant frames of reference. Binary bias accelerates this:
it simplifies complexity
it locks narratives into “what we are” vs “what we are not”
it makes alternative futures harder to imagine
4) Behavioural dysfunction
Binary cultures often trigger:
passing the buck (“not my problem”)
procrastination (“if we can’t solve it fully, we won’t start”)
risk aversion (“better to do nothing than be wrong”)
In extreme cases, binary thinking contributes to catastrophic failure — such as the compounding conditions seen in Northern Rock’s collapse.
A useful model: the “trigger threshold”
One of the most helpful insights from synthesising the evidence is this: binary thinking becomes dangerous when multiple triggers accumulate. As a baseline, we can assume that all leaders categorise. Categorising is hardwired and normal, but binary bias intensifies when several pressures align:
information overload
time pressure
high stakes
conformist culture
isolation from outside influence
strong preference for closure
moral or identity threat (“we can’t be seen to compromise”)
A single trigger might not cause major distortion, but stack them together and binary thinking crosses a threshold — from “helpful simplification” to “strategic blindness.”
So what can leaders do about it?
The evidence suggests binary bias is self-reinforcing — so the solution can’t be one-dimensional. The most promising mitigation approaches combine structural controls with process discipline and individual awareness.
Here are practical interventions that map well to the research:
1) Build structural “anti-binary” friction
Cristofaro (2017) points to the value of quality control mechanisms and diverse stakeholders in complex initiatives.
In practice, this can look like:
pre-mortems
red teams / independent challenge
third-party review
checklists for high-stakes decisions
structured decision logs (“what did we believe at the time?”)
These mechanisms don’t eliminate bias — they reduce the ease of oversimplifying.
2) Shift leadership from outcome-directive to process-directive
Outcome focus can intensify binary thinking:
“Did we win or lose?”
“Did it work or not?”
Process-directive leadership forces a better question:
“Did we make the best decision we could, given uncertainty?”
That reframes learning, reduces fear, and keeps nuance alive.
3) Upgrade how you talk about performance
Instead of “good/bad,” use spectrum language:
“strong / emerging / inconsistent”
“stable / improving / declining”
“low confidence / medium confidence / high confidence”
“signal / noise / unclear”
The goal is not to sound academic but to keep reality continuous.
4) Train numerical literacy (quietly)
Fisher and Keil (2018) found that highly numerate individuals were less susceptible to binary bias. That doesn’t mean everyone needs to become a statistician, but leaders do need enough comfort with:
ranges
distributions
uncertainty
base rates
confidence levels
Binary bias thrives where numbers are treated as “proof” rather than “evidence.”
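As an illustration of range-based thinking, here is a small sketch (the team numbers are invented, and the interval uses a rough normal approximation) that reports a difference as a range rather than a verdict:

```python
import statistics as st

# Hypothetical weekly output for two teams.
team_a = [48, 52, 50, 47, 53, 51]
team_b = [46, 54, 49, 50, 52, 47]

diff = st.mean(team_a) - st.mean(team_b)
# Standard error of the difference in means (rough normal approximation).
se = (st.variance(team_a) / len(team_a) + st.variance(team_b) / len(team_b)) ** 0.5
low, high = diff - 1.96 * se, diff + 1.96 * se

# If the interval spans zero, the honest answer is "unclear", not better/worse.
verdict = "unclear" if low < 0 < high else ("A ahead" if diff > 0 else "B ahead")
print(f"difference = {diff:.2f}, 95% interval = ({low:.2f}, {high:.2f}) -> {verdict}")
```

The specifics (six data points, the 1.96 multiplier) are simplifications; the point is that an interval keeps uncertainty visible where a single “better/worse” label hides it.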
The leadership test isn’t decisiveness. It’s tolerance for nuance.
Binary bias is seductive because it feels like clarity. Bear in mind, however, that (1) what sometimes looks like clarity is just simplification; and (2) what looks like decisiveness is just discomfort with ambiguity.
The leaders who scale best — and sustain performance — aren’t the ones who always have a strong view. They’re the ones who can hold complexity long enough to make a better decision.
Because most of the time, the truth isn’t yes or no. It’s “it depends”. And that’s the point.
Practical reflection questions (for your next decision)
Before you lock in a conclusion, ask:
Am I treating this as a switch when it’s really a spectrum?
What nuance am I compressing out of the story?
What would a “third option” look like?
What would make me change my mind?
What is the cost of being confidently wrong?
References
Fisher, M.A., & Keil, F. (2018). The Binary Bias: A Systematic Distortion in the Integration of Information. Psychological Science.
Tetlock, P. (2000). Cognitive Biases and Organizational Correctives.
Abatecola, G., Caputo, A., & Cristofaro, M. (2018). Reviewing cognitive distortions in managerial decision making. Journal of Management Development.
Wright, G., van der Heijden, K., Bradfield, R., Burt, G., & Cairns, G. (2004). The Psychology of Why Organizations Can be Slow to Adapt and Change.
Ramachandran, V., & Gopal, A. (2010). Managers' Judgments of Performance in IT Services Outsourcing. JMIS.
Guilfoyle, S. (2015). Binary Comparisons and Police Performance Measurement: Good or Bad?
Cristofaro, M. (2017). Reducing biases of decision-making processes in complex organizations.
Cross, R., & Brodt, S. (2001). How Assumptions of Consensus Undermine Decision Making.
Kamau, C., & Harorimana, D. (2008). Knowledge sharing and withholding in organizational committees.