The systematic tendency to underestimate how long tasks will take, how much they'll cost, and what risks they face, due to focusing on the specific plan rather than similar projects.
The planning fallacy reflects optimistic bias and the inside view—we focus on our specific plan and imagine best-case scenarios while failing to consider the many ways things could go wrong. We don't naturally consult base rates showing how long similar projects actually took. This affects individuals (students predicting thesis completion), organizations (construction projects), and governments (infrastructure budgets). It's exacerbated by motivated reasoning—we want projects to succeed, so we convince ourselves they will. The solution is reference class forecasting: comparing your situation to similar past cases rather than treating it as unique.
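Reference class forecasting can be sketched numerically: instead of trusting the inside-view estimate, report the median and a high percentile of outcomes from comparable past projects. A minimal Python sketch, with hypothetical durations (illustrative numbers, not data from the book):

```python
import math
from statistics import median

# Hypothetical completion times (in months) of comparable past
# projects: the "outside view". Illustrative, not data from the book.
reference_class = [30, 48, 60, 72, 84, 96, 108, 120]

inside_view_estimate = 24  # the optimistic, plan-based guess

def percentile(data, p):
    """Nearest-rank percentile (0 < p <= 100)."""
    ordered = sorted(data)
    k = math.ceil(p / 100 * len(ordered)) - 1
    return ordered[k]

# Anchor on the distribution of similar projects, not on the plan.
outside_view_median = median(reference_class)       # 78.0 months
outside_view_p90 = percentile(reference_class, 90)  # 120 months

print(f"Inside view:  {inside_view_estimate} months")
print(f"Outside view: median {outside_view_median}, 90th pct {outside_view_p90}")
```

The gap between the plan-based number and the reference-class percentiles is the correction the planning fallacy hides.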
Kahneman's own curriculum project: the team estimated 18-30 months for completion. When asked about similar projects, the expert revealed 40% were never completed and none finished in less than seven years. Despite this devastating base rate, the team continued—and took eight years.
"My project is different from others, so base rates don't apply." In reality, everyone thinks their project is special; the planning fallacy persists even when you know about it.
Why does the planning fallacy persist even when people know about it and have experienced it in their own past projects?
In Kahneman's curriculum project, the team estimated 18-30 months, but the expert revealed similar projects took 7+ years with 40% never completing. Why did the team continue despite this devastating base rate?
System 2 (Mental Model): The slow, deliberate, effortful mode of thinking that allocates attention to complex computations, self-control, and conscious reasoning.
System 1 (Mental Model): The fast, automatic, intuitive mode of thinking that operates effortlessly and generates impressions, intuitions, and feelings without conscious control.
Availability Heuristic (Mental Model): Judging the frequency or probability of events by how easily examples come to mind, leading to overestimation of vivid or recent events.
Representativeness Heuristic (Mental Model): Judging probability by how much something resembles a typical case while ignoring base rates, sample size, and statistical principles.
Anchoring Effect (Mental Model): The tendency to rely too heavily on an initial piece of information (the anchor) when making subsequent judgments, even when the anchor is arbitrary or irrelevant.
Loss Aversion (Principle): The principle that losses loom psychologically larger than equivalent gains, with losing something feeling roughly twice as bad as gaining the same thing feels good.
Prospect Theory (Framework): A descriptive model of decision-making under risk showing that people evaluate outcomes relative to a reference point, are loss-averse, and weight probabilities non-linearly.
WYSIATI, What You See Is All There Is (Principle): System 1's tendency to construct the most coherent story possible from currently available information without considering what's missing or questions not asked.
Planning Fallacy: The systematic tendency to underestimate how long tasks will take, how much they'll cost, and what risks they face, due to focusing on the specific plan rather than similar projects.
The Planning Fallacy is explored in depth in "Thinking, Fast and Slow" by Daniel Kahneman.