Mindset & Psychology

Thinking, Fast and Slow

Daniel Kahneman·2011

The Nobel-winning tour of the two minds inside your head.

Summary·Thinking, Fast and Slow

The big idea

Kahneman summarizes a lifetime of research with Amos Tversky into the architecture of human judgment. System 1 is fast, intuitive, automatic; System 2 is slow, effortful, analytical. Most of our errors come from System 1 running unchecked: anchoring, availability, framing, loss aversion. The book is dense, sometimes academic, but it is the canonical reference for how human cognition actually works, as opposed to how we imagine it works.

Highlight 1·Mindset & thinking

System 1 is fast and intuitive; System 2 is slow and analytical — both are needed.

Daniel Kahneman opens with two prompts. Read 2 + 2: the answer, 4, appears before you finish the sentence, automatically and costlessly. Now compute 17 × 24: pupils dilate, heart rate climbs, and you stop walking if you were walking. The first is System 1, fast and effortless; the second is System 2, slow and effortful. The rest of the book is a tour of the places where System 1 produces errors that System 2 fails to catch.

Highlight 2·Systems & frameworks

Anchoring: the first number you see massively biases the next number you produce.

Kahneman and Tversky ran a now-classic experiment at the University of Oregon. Subjects watched a wheel of fortune — secretly rigged to land on either 10 or 65 — then were asked what percentage of UN member states are African. Those who saw 10 guessed 25%. Those who saw 65 guessed 45%. A visibly random and obviously irrelevant number shifted answers by 20 percentage points.

Highlight 3·Resilience & protection

Loss aversion: losses hurt about twice as much as equivalent gains feel good.

Kahneman and Tversky published prospect theory in Econometrica in 1979. The core experiment: subjects offered a coin flip — heads you win $200, tails you lose $100 — overwhelmingly refused, even though the expected value is +$50. The pain of the $100 loss is roughly twice as intense as the pleasure of the $200 gain. They called the ratio the loss-aversion coefficient: about 2.0 across most populations.
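
To make the arithmetic explicit (a quick sketch, using the roughly 2.0 coefficient the paragraph cites): the raw expected value of the flip is

EV = 0.5 × (+$200) + 0.5 × (-$100) = +$50

but if every dollar lost is felt about twice as strongly as a dollar gained, the subjective value becomes

V ≈ 0.5 × (+$200) - 0.5 × 2.0 × ($100) = $100 - $100 = $0

and refusing the bet no longer looks irrational.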

Highlight 4·Mindset & thinking

Availability heuristic: vivid recent examples feel more probable than they are.

Paul Slovic and colleagues at the University of Oregon asked subjects to estimate the relative frequency of various causes of death. Tornadoes were judged more frequent killers than asthma — though asthma kills roughly twenty times as many Americans annually. The difference: tornadoes lead the news; asthma deaths happen quietly in apartments. Vividness drove the estimate, not data.

Highlight 5·Mindset & thinking

Framing changes decisions: 90% survival sounds different from 10% mortality.

Barbara McNeil at Harvard Medical School ran a 1982 study asking experienced physicians to choose between surgery and radiation for lung cancer. Group A read survival rates: '90% of patients survive the operation.' Group B read the same data framed as mortality: '10% die during the operation.' The decision-relevant facts were identical; only the frame differed. Yet surgery preference dropped from 84% in the survival frame to 50% in the mortality frame.

Highlight 6·Reflection & awareness

Confidence is a feeling, not a guarantee of accuracy — overconfidence is the default.

Kahneman recounts his own experience in 1955, designing a soldier-evaluation test for the Israel Defense Forces. He felt total confidence in its validity. Months later, the follow-up data showed his test was barely better than chance at predicting officer performance. He kept feeling sure even after he knew he shouldn't be. He called the experience his discovery of 'the illusion of validity,' and admits he kept making the same error for years afterward.

Highlight 7·Reflection & awareness

Premortems: imagine the project failed — why? — to surface risks System 1 hides.

Gary Klein, a Naturalistic Decision Making researcher who'd spent decades studying firefighters and military commanders, proposed the premortem in a 2007 Harvard Business Review piece. The technique: before launching a project, gather the team and ask everyone to imagine it is a year later and the project has failed badly, then write a short paragraph explaining why. Kahneman immediately adopted it.
