Judging probability by how much something resembles a typical case while ignoring base rates, sample size, and statistical principles.
When people assess probability using representativeness, they focus on how well something matches a stereotype or prototype, neglecting crucial statistical information. This produces the conjunction fallacy (judging 'Linda is a bank teller and a feminist' more probable than 'Linda is a bank teller'), insensitivity to sample size (treating small samples as equally representative as large ones), and misconceptions about randomness (expecting random sequences to 'look random', without streaks). The heuristic explains why people choose 'librarian' over 'farmer' for shy, orderly Steve, even though farmers vastly outnumber librarians.
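Insensitivity to sample size can be made concrete with a short simulation (an illustrative sketch, not from the book's text): with a fair coin, small samples produce 'extreme' results far more often than large ones, which is why treating a small sample as equally representative is a mistake.

```python
import random

random.seed(0)

def extreme_share(sample_size, trials=10_000, threshold=0.6):
    """Fraction of samples in which heads exceed `threshold`,
    even though the true rate is exactly 0.5."""
    count = 0
    for _ in range(trials):
        heads = sum(random.random() < 0.5 for _ in range(sample_size))
        if heads / sample_size > threshold:
            count += 1
    return count / trials

# Small samples hit >60% heads often; large samples almost never do.
print(f"n=10:  {extreme_share(10):.3f}")
print(f"n=100: {extreme_share(100):.3f}")
```

The same logic underlies Kahneman's hospital problem: the smaller hospital records more days with over 60% boys, simply because small samples fluctuate more.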
Investors chase mutual funds with recent strong performance, assuming the fund manager has skill (strong performance resembles the stereotype of a skilled manager), while ignoring regression to the mean and the base rate that most active managers underperform.
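A toy simulation (assumed parameters, not real fund data) shows why chasing recent winners fails: if yearly returns are a small skill component plus a large luck component, the top performers of one year regress sharply toward the mean the next, because the luck does not repeat.

```python
import random

random.seed(1)
n_funds = 1000

# Each fund has a small persistent skill; yearly return adds large noise.
skill = [random.gauss(0, 1) for _ in range(n_funds)]

def yearly_return(s):
    return s + random.gauss(0, 5)  # luck (sd 5) dominates skill (sd 1)

year1 = [yearly_return(s) for s in skill]
year2 = [yearly_return(s) for s in skill]

# Pick the top 10% of year-1 performers and track them into year 2.
top = sorted(range(n_funds), key=lambda i: year1[i], reverse=True)[:100]
avg_top_y1 = sum(year1[i] for i in top) / len(top)
avg_top_y2 = sum(year2[i] for i in top) / len(top)

print(f"top decile, year 1: {avg_top_y1:.2f}")
print(f"same funds, year 2: {avg_top_y2:.2f}")  # much closer to the mean of 0
```

The year-2 average collapses toward zero because selecting on a noisy outcome mostly selects lucky funds, not skilled ones.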
If something looks like a typical example, it's probably that thing? In reality, you must also consider base rates (how common each category is) before making probability judgments.
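Bayes' rule makes the base-rate correction explicit. The numbers below are hypothetical (not from the book): even if shy, orderly people are four times as common among librarians as among farmers, a 20:1 base rate in favor of farmers still leaves 'librarian' the unlikely answer.

```python
# Assumed base rates: 1 librarian for every 20 farmers.
p_librarian = 1 / 21
p_farmer = 20 / 21

# Assumed likelihoods: the 'shy, orderly' description fits librarians 4x better.
p_shy_given_librarian = 0.40
p_shy_given_farmer = 0.10

# Bayes' rule: P(librarian | shy) = P(shy | librarian) P(librarian) / P(shy)
numerator = p_shy_given_librarian * p_librarian
evidence = numerator + p_shy_given_farmer * p_farmer
p_librarian_given_shy = numerator / evidence

print(f"P(librarian | shy) = {p_librarian_given_shy:.2f}")  # → 0.17
```

Despite the strong resemblance, Steve is still about five times more likely to be a farmer, because resemblance (the likelihood) is only one factor; the prior matters at least as much.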
Why do people judge that 'shy, orderly Steve' is more likely to be a librarian than a farmer, even though this judgment is probably wrong?
How does the representativeness heuristic explain the conjunction fallacy in the Linda problem?
System 2 (Mental Model): The slow, deliberate, effortful mode of thinking that allocates attention to complex computations, self-control, and conscious reasoning.
System 1 (Mental Model): The fast, automatic, intuitive mode of thinking that operates effortlessly and generates impressions, intuitions, and feelings without conscious control.
Availability Heuristic (Mental Model): Judging the frequency or probability of events by how easily examples come to mind, leading to overestimation of vivid or recent events.
Anchoring Effect (Mental Model): The tendency to rely too heavily on an initial piece of information (the anchor) when making subsequent judgments, even when the anchor is arbitrary or irrelevant.
Loss Aversion (Principle): The principle that losses loom psychologically larger than equivalent gains, with losing something feeling roughly twice as bad as gaining the same thing feels good.
Prospect Theory (Framework): A descriptive model of decision-making under risk showing that people evaluate outcomes relative to a reference point, are loss-averse, and weight probabilities non-linearly.
What You See Is All There Is (WYSIATI) (Principle): System 1's tendency to construct the most coherent story possible from currently available information without considering what's missing or questions not asked.
Planning Fallacy (Principle): The systematic tendency to underestimate how long tasks will take, how much they'll cost, and what risks they face, due to focusing on the specific plan rather than similar projects.
Representativeness Heuristic: Judging probability by how much something resembles a typical case while ignoring base rates, sample size, and statistical principles.
Representativeness Heuristic is explored in depth in "Thinking, Fast and Slow" by Daniel Kahneman. Distilo provides a deep AI-powered analysis with key insights, audio narration, and practical frameworks.