A debiasing technique where you imagine a project has failed spectacularly in the future and work backward to explain what went wrong, surfacing risks that optimistic planning obscures.
Unlike a post-mortem (which analyzes failure after it occurs), a pre-mortem happens before a project begins. By assuming failure and generating explanations, you overcome optimistic bias and the planning fallacy. The technique forces System 2 engagement and surfaces risks that System 1's optimistic storytelling would miss. It's particularly effective in groups because it legitimizes dissent—team members who might hesitate to voice concerns in normal planning can freely identify risks in the pre-mortem context. This is one of Kahneman's most practical recommendations for improving organizational decision-making.
Before launching a new product, gather the team and say: 'It's one year from now and our product launch was a disaster. Write down why it failed.' Team members might identify risks like: 'We underestimated competitor response,' 'The technology wasn't ready,' 'We didn't have enough customer support staff.' These risks can then be addressed proactively.
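The session described above can be sketched as a small script. This is a hypothetical illustration, not part of Kahneman's method: each team member independently writes down failure reasons, and a facilitator tallies how many people raised each risk so the most widely shared concerns surface first. The function name and risk tags are invented for the example.

```python
from collections import Counter

def rank_premortem_risks(responses):
    """Tally independently written failure reasons from a pre-mortem
    session and rank them by how many team members raised each one.

    `responses` maps each participant to their list of risk tags
    (normalized beforehand, e.g. by the facilitator).
    """
    counts = Counter()
    for reasons in responses.values():
        # Count each risk once per participant, even if they repeat it.
        counts.update(set(reasons))
    return counts.most_common()

# Hypothetical session: three participants write down failure reasons.
session = {
    "alice": ["competitor response", "tech not ready"],
    "bob":   ["tech not ready", "support understaffed"],
    "carol": ["competitor response", "tech not ready"],
}
print(rank_premortem_risks(session))
# "tech not ready" tops the list, since all three participants raised it.
```

Ranking by breadth of agreement (rather than letting the loudest voice set the agenda) mirrors why the pre-mortem works in groups: every member contributes risks independently before discussion begins.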
A common misconception is that pre-mortems are pessimistic and demotivating. In reality, they are realistic and protective: surfacing risks early lets teams mitigate them before they become problems.
How does the pre-mortem technique relate to overcoming the planning fallacy and optimistic bias in group settings?
The slow, deliberate, effortful mode of thinking that allocates attention to complex computations, self-control, and conscious reasoning.
Mental Model: The fast, automatic, intuitive mode of thinking that operates effortlessly and generates impressions, intuitions, and feelings without conscious control.
Mental Model: Judging the frequency or probability of events by how easily examples come to mind, leading to overestimation of vivid or recent events.
Mental Model: Judging probability by how much something resembles a typical case while ignoring base rates, sample size, and statistical principles.
Mental Model: The tendency to rely too heavily on an initial piece of information (the anchor) when making subsequent judgments, even when the anchor is arbitrary or irrelevant.
Mental Model: The principle that losses loom psychologically larger than equivalent gains, with losing something feeling roughly twice as bad as gaining the same thing feels good.
Principle: A descriptive model of decision-making under risk showing that people evaluate outcomes relative to a reference point, are loss-averse, and weight probabilities non-linearly.
Framework: System 1's tendency to construct the most coherent story possible from currently available information without considering what's missing or questions not asked.
Principle: A debiasing technique where you imagine a project has failed spectacularly in the future and work backward to explain what went wrong, surfacing risks that optimistic planning obscures.
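The loss-aversion and prospect-theory entries above can be made concrete with a value function of the form Tversky and Kahneman estimated in 1992 (α = β = 0.88, λ = 2.25). This is a sketch with those published parameter values, not the only parameterization in the literature:

```python
def prospect_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Prospect-theory value function, relative to a reference point of 0.

    Gains are valued as x**alpha and losses as -lam * (-x)**beta, so an
    equivalent loss looms larger than a gain (lam ~ 2.25 in Tversky &
    Kahneman's 1992 estimates).
    """
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** beta

# Losing $100 hurts far more than gaining $100 pleases:
gain = prospect_value(100)
loss = prospect_value(-100)
print(abs(loss) / gain)  # ratio of 2.25: losses loom roughly twice as large
```

With α = β, the loss-to-gain ratio for equal amounts equals λ exactly, which is where the "losing feels roughly twice as bad" rule of thumb comes from.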
Pre-Mortem is explored in depth in "Thinking, Fast and Slow" by Daniel Kahneman.