The tendency to ignore statistical base rates (how common something is in the general population) when evaluating specific cases, focusing instead on case-specific information.
When people receive both base rate information (statistical frequencies) and case-specific information (details about an individual), they typically overweight the specific information and underweight or ignore the base rates. This is a manifestation of the representativeness heuristic—people judge probability by how well something matches a stereotype rather than by considering prior probabilities. Base rate neglect leads to systematic errors in diagnosis, prediction, and judgment across domains from medicine to law to business. The solution is to explicitly consider base rates before evaluating case-specific information, as in reference class forecasting.
If a disease affects 1% of the population and a test is 90% accurate, a positive test doesn't mean you have a 90% chance of having the disease. You must consider the base rate: among 1,000 people, 10 have the disease (9 test positive) and 990 don't (99 test positive). So a positive test means only 9/(9+99) = 8.3% chance of disease.
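The counting argument above is just Bayes' theorem. Here is a minimal sketch in Python that reproduces the same numbers; the function name is illustrative, and the inputs (1% prevalence, 90% sensitivity and 90% specificity) are taken from the example.

```python
def posterior_positive(base_rate, sensitivity, specificity):
    """P(disease | positive test) via Bayes' theorem."""
    true_pos = base_rate * sensitivity                # diseased people who test positive
    false_pos = (1 - base_rate) * (1 - specificity)   # healthy people who test positive
    return true_pos / (true_pos + false_pos)

p = posterior_positive(base_rate=0.01, sensitivity=0.90, specificity=0.90)
print(f"P(disease | positive) = {p:.1%}")  # prints "P(disease | positive) = 8.3%"
```

Plugging in a higher base rate (say 20%) shows how strongly the prior drives the answer, which is exactly the point of the example.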
It is tempting to assume that specific information about this case is more relevant than general statistics. In fact, base rates should be your starting point, and you should deviate from them only when you have strong case-specific evidence.
A disease affects 1% of the population. A test for the disease is 90% accurate (correctly identifies 90% of cases and correctly rules out 90% of non-cases). You test positive. Why is your probability of having the disease much lower than 90%?
How does the representativeness heuristic cause base rate neglect?
System 2 (Mental Model): The slow, deliberate, effortful mode of thinking that allocates attention to complex computations, self-control, and conscious reasoning.
System 1 (Mental Model): The fast, automatic, intuitive mode of thinking that operates effortlessly and generates impressions, intuitions, and feelings without conscious control.
Availability Heuristic (Mental Model): Judging the frequency or probability of events by how easily examples come to mind, leading to overestimation of vivid or recent events.
Representativeness Heuristic (Mental Model): Judging probability by how much something resembles a typical case while ignoring base rates, sample size, and statistical principles.
Anchoring (Mental Model): The tendency to rely too heavily on an initial piece of information (the anchor) when making subsequent judgments, even when the anchor is arbitrary or irrelevant.
Loss Aversion (Principle): The principle that losses loom psychologically larger than equivalent gains, with losing something feeling roughly twice as bad as gaining the same thing feels good.
Prospect Theory (Framework): A descriptive model of decision-making under risk showing that people evaluate outcomes relative to a reference point, are loss-averse, and weight probabilities non-linearly.
What You See Is All There Is (Principle): System 1's tendency to construct the most coherent story possible from currently available information without considering what's missing or questions not asked.
Base Rate Neglect is explored in depth in "Thinking, Fast and Slow" by Daniel Kahneman.