This is a preview of the Shortform book summary of Thinking, Fast and Slow by Daniel Kahneman.

1-Page Book Summary of Thinking, Fast and Slow

Thinking, Fast and Slow concerns a few major questions: How do we make decisions? And in what ways do we make them poorly?

The book covers three areas of Daniel Kahneman’s research: cognitive biases, prospect theory, and happiness.

System 1 and 2

Kahneman defines two systems of the mind.

System 1: operates automatically and quickly, with little or no effort, and no sense of voluntary control

  • Examples: Detect that one object is farther than another; detect sadness in a voice; read words on billboards; understand simple sentences; drive a car on an empty road.

System 2: allocates attention to the effortful mental activities that demand it, including complex computations. Often associated with the subjective experience of agency, choice and concentration

  • Examples: Focus attention on a particular person in a crowd; exercise faster than is normal for you; monitor your behavior in a social situation; park in a narrow space; multiply 17 x 24.

System 1 automatically generates suggestions, feelings, and intuitions for System 2. If endorsed by System 2, intuitions turn into beliefs, and impulses turn into voluntary actions.

System 1 can be completely involuntary. You can’t stop your brain from completing 2 + 2 = ?, or from finding a cheesecake delicious. You can’t unsee optical illusions, even if you rationally know what’s going on.

A lazy System 2 accepts what the faulty System 1 gives it, without questioning. This leads to cognitive biases. Even worse, cognitive strain taxes System 2, making it more willing to accept System 1’s suggestions. Therefore, we’re more vulnerable to cognitive biases when we’re stressed.

Because System 1 operates automatically and can’t be turned off, biases are difficult to prevent. Yet it’s also not wise (or energetically possible) to constantly question System 1, and System 2 is too slow to substitute in routine decisions. We should aim for a compromise: recognize situations when we’re vulnerable to mistakes, and avoid large mistakes when the stakes are high.

Cognitive Biases and Heuristics

Despite all the complexities of life, notice that you’re rarely stumped. You rarely face situations as mentally taxing as having to solve 9382 x 7491 in your head.

Isn’t it profound how we can make decisions without realizing it? You like or dislike people before you know much about them; you feel a company will succeed or fail without really analyzing it.

When faced with a difficult question, System 1 substitutes an easier question, or the heuristic question. The answer is often adequate, though imperfect.

Consider the following examples of heuristics:

  • Target question: Is this company’s stock worth buying? Will the price increase or decrease?
    • Heuristic question: How much do I like this company?
  • Target question: How happy are you with your life?
    • Heuristic question: What’s my current mood?
  • Target question: How far will this political candidate get in her party?
    • Heuristic question: Does this person look like a political winner?

These heuristic questions are related to the target questions, but they’re imperfect substitutes. When System 1 produces an imperfect answer, System 2 has the opportunity to reject it, but a lazy System 2 often endorses the heuristic answer without much scrutiny.

Important Biases and Heuristics

Confirmation bias: We tend to find and interpret information in a way that confirms our prior beliefs. We selectively pay attention to data that fit our prior beliefs and discard data that don’t.

“What you see is all there is”: We don’t consider the global set of alternatives or data. We don’t notice what data are missing. Related:

  • Planning fallacy: we habitually underestimate the amount of time a project will take. This is because we ignore the many ways things could go wrong and visualize an ideal world where nothing goes wrong.
  • Sunk cost fallacy: we separate life into separate accounts, instead of considering the global account. For example, if you narrowly focus on a single failed project, you feel reluctant to cut your losses, but a broader view would show that you should cut your losses and put your resources elsewhere.

Ignoring reversion to the mean: If randomness is a major factor in outcomes, high performers today will suffer and low performers will improve, for no meaningful reason. Yet pundits will create superficial causal relationships to explain these random fluctuations in success and failure, observing that high performers buckled under the spotlight, or that low performers lit a fire of motivation.
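
To see that this is pure statistics rather than psychology, here’s a minimal simulation (a sketch, not an experiment from the book; the half-skill, half-luck split is an illustrative assumption): each score is stable skill plus fresh random luck, and the top performers of one round reliably fall back toward the mean in the next, with no causal story needed.

```python
import random

random.seed(0)
N = 100_000

# Each round's score = stable skill + fresh luck (redrawn every round).
skill = [random.gauss(0, 1) for _ in range(N)]
round1 = [s + random.gauss(0, 1) for s in skill]
round2 = [s + random.gauss(0, 1) for s in skill]

# Select the top 1% of round-1 performers.
cutoff = sorted(round1, reverse=True)[N // 100]
top = [i for i in range(N) if round1[i] >= cutoff]

avg1 = sum(round1[i] for i in top) / len(top)
avg2 = sum(round2[i] for i in top) / len(top)
print(f"Top performers, round 1 average: {avg1:.2f}")
print(f"Same people,    round 2 average: {avg2:.2f}  (regressed toward the mean)")
```

Nothing happened to these performers between rounds; only the luck component was redrawn. Any pundit narrative about “buckling under the spotlight” is explaining noise.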

Anchoring: When shown an initial piece of information, you bias toward that information, even if it’s irrelevant to the decision at hand. For instance, in one study, when a nonprofit requested $400, the average donation was $143; when it requested $5, the average donation was $20. The first piece of information (in this case, the suggested donation) influences our decision (in this case, how much to donate), even though the suggested amount shouldn’t be relevant to deciding how much to give.

Representativeness: You tend to use your stereotypes to make decisions, even when they contradict common sense statistics. For example, if you’re told about someone who is meek and keeps to himself, you’d guess the person is more likely to be a librarian than a construction worker, even though there are far more of the latter than the former in the country.

Availability bias: Vivid images and stronger emotions make items easier to recall, and the more easily something comes to mind, the more frequent or important we judge it to be.

Here's a preview of the rest of Shortform's Thinking, Fast and Slow summary:

Thinking, Fast and Slow Summary Part 1-1: Two Systems of Thinking

We believe we’re being rational most of the time, but really much of our thinking is automatic, done subconsciously by instinct. Most impressions arise without your knowing how they got there. Can you pinpoint exactly how you knew a man was angry from his facial expression, or how you could tell that one object was farther away than another, or why you laughed at a funny joke?

This becomes more practically important for the decisions we make. Often, we’ve decided what we’re going to do before we even realize it. Only after this subconscious decision does our rational mind try to justify it.

The brain does this to save on effort, substituting easier questions for harder ones. Instead of thinking, “Should I invest in Tesla stock? Is it priced correctly?” you might think, “Do I like Tesla cars?” The insidious part is that you often don’t notice the substitution. This type of substitution produces systematic errors, also called biases. We are blind to our blindness.

System 1 and System 2 Thinking

In Thinking, Fast and Slow, Kahneman defines two systems of the mind:

System 1: operates automatically and quickly, with little or no effort, and no sense of voluntary control...

Thinking, Fast and Slow Summary Part 1-2: System 2 Has a Maximum Capacity

System 2 thinking has a limited budget of attention - you can only do so many cognitively difficult things at once.

This limitation is true when doing two tasks at the same time - if you’re navigating traffic on a busy highway, it becomes far harder to solve a multiplication problem.

This limitation is also true when one task comes after another - depleting System 2 resources earlier in the day can lower inhibitions later. For example, a hard day at work will make you more susceptible to impulsive buying from late-night infomercials. This is also known as “ego depletion,” or the idea that you have a limited pool of willpower or mental resources that can be depleted each day.

All forms of voluntary effort - cognitive, emotional, physical - seem to draw at least partly on a shared pool of mental energy.

  • Stifling emotions during a sad film worsens physical stamina later.
  • Memorizing a list of seven digits makes subjects more likely to yield to the temptation of a decadent dessert.

Differences in Demanding Tasks

The law of least effort states that “if there are several ways of achieving the same goal, people will eventually gravitate to the least demanding course of action.”

Thinking, Fast and Slow Summary Part 1-3: System 1 is Associative

Think of your brain as a vast network of ideas connected to each other. These ideas can be concrete or abstract. The ideas can involve memories, emotions, and physical sensations.

When one node in the network is activated, say by seeing a word or image, it automatically activates its surrounding nodes, rippling outward like a pebble thrown in water.

As an example, consider the following two words:

“Bananas Vomit”

Suddenly, within a second, reading those two words may have triggered a host of different ideas. You might have pictured yellow fruits; felt a physiological aversion in the pit of your stomach; remembered the last time you vomited; thought about other diseases - all done automatically without your conscious control.

The evocations can be self-reinforcing - a word evokes memories, which evoke emotions, which evoke facial expressions, which evoke other reactions, and which reinforce other ideas.

Links between ideas consist of several forms:

  • Cause → Effect
  • Belonging to the Same Category (lemon → fruit)
  • Things to their properties (lemon → yellow, sour)
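
To make the “rippling outward” concrete, here’s a toy spreading-activation sketch. This is our illustration, not a model from the book: the nodes, link weights, decay rate, and threshold are all invented. Activating “bananas” and “vomit” automatically lights up linked ideas, more weakly with each hop.

```python
# Toy associative network: ideas as nodes, weighted links between them.
# (All nodes, weights, and parameters here are invented for illustration.)
links = {
    "bananas":  {"fruit": 0.9, "yellow": 0.8},
    "vomit":    {"sickness": 0.9, "disgust": 0.8},
    "sickness": {"disgust": 0.7},
    "fruit":    {"food": 0.6},
}

def spread(seeds, decay=0.5, threshold=0.1):
    """Activate the seed ideas, then let activation ripple outward,
    weakening with each hop, until it falls below the threshold."""
    activation = {node: 1.0 for node in seeds}
    frontier = list(seeds)
    while frontier:
        node = frontier.pop()
        for neighbor, weight in links.get(node, {}).items():
            a = activation[node] * weight * decay
            if a > threshold and a > activation.get(neighbor, 0.0):
                activation[neighbor] = a
                frontier.append(neighbor)
    return activation

print(spread(["bananas", "vomit"]))
# {'bananas': 1.0, 'vomit': 1.0, 'sickness': 0.45, 'disgust': 0.4,
#  'fruit': 0.45, 'yellow': 0.4, 'food': 0.135}
```

The point of the sketch is the shape of the process: activation is automatic, spreads along links of the kinds listed above, and fades with distance, which is why “bananas” can summon “food” without your consciously deciding anything.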

Association is Fast and Subconscious

In the next exercise, you’ll be shown three words....

Thinking, Fast and Slow Summary Part 1-4: How We Make Judgments

System 1 continuously monitors what’s going on outside and inside the mind and generates assessments with little effort and without intention. The basic assessments include language, facial recognition, social hierarchy, similarity, causality, associations, and exemplars.

  • In this way, you can look at a male face and consider him competent (for instance, if he has a strong chin and a slight confident smile).
  • The survival purpose is to monitor surroundings for threats.

However, not every attribute of the situation is measured. System 1 is much better at comparing things and judging the average of a set than at computing a sum. Here’s an example:

In the below picture, try to quickly determine what the average length of the lines is.

[Image: a set of lines of varying lengths]

Now try to determine the sum of the length of the lines. This is less intuitive and requires System 2.

Unlike System 2 thinking, these basic assessments of System 1 are not impaired when the observer is cognitively busy.

In addition to basic assessments, System 1 also has two other...

Thinking, Fast and Slow Summary Part 1-5: Biases of System 1

Putting it all together, we are most vulnerable to biases when:

  • System 1 forms a narrative that conveniently connects the dots and doesn’t express surprise.
  • Because of the cognitive ease produced by System 1, System 2 is not invoked to question the data. It merely accepts the conclusions of System 1.

In day-to-day life, this is acceptable if the conclusions are likely to be correct, the costs of a mistake are acceptable, and if the jump saves time and effort. You don’t question whether to brush your teeth each day, for example.

In contrast, this shortcut in thinking is risky when the stakes are high and there’s no time to collect more information, like when serving on a jury, deciding which job applicant to hire, or deciding how to behave in a weather emergency.

We’ll end part 1 with a collection of biases.

What You See is All There Is: WYSIATI

When presented with evidence, especially evidence that confirms your mental model, you do not question what evidence might be missing. System 1 seeks to build the most coherent story it can - it does not stop to examine the quality and the quantity of information.

In an experiment, three groups were given background to a legal case....

Thinking, Fast and Slow Summary Part 2: Heuristics and Biases | 1: Statistical Mistakes

Kahneman transitions from Part 1 to Part 2 by explaining more heuristics and biases we’re subject to.

The general theme of these biases: we prefer certainty over doubt. We prefer coherent stories of the world, clear causes and effects. Sustaining incompatible viewpoints at once is harder work than sliding into certainty. A message, if it is not immediately rejected as a lie, will affect our thinking, regardless of how unreliable the message is.

Furthermore, we pay more attention to the content of the story than to the reliability of the data. We prefer simple, coherent views of the world and overlook the reasons those views are undeserved. We overestimate causal explanations and ignore base statistical rates. Often, our intuitive predictions are too extreme, and we put too much faith in them.

This chapter will focus on statistical mistakes - when our biases make us misinterpret statistical truths.

The Law of Small Numbers

The smaller your sample size, the more likely you are to have extreme results. When you have small sample sizes, do NOT be misled by outliers.

A facetious example: in a series of 2 coin tosses, you are far more likely to get 100% heads than in a series of 100 tosses....
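
A quick simulation makes the point (a sketch of the general principle, not an experiment from the book): extreme results like “all heads” are common in tiny samples and vanish as samples grow.

```python
import random

random.seed(0)

def prob_all_heads(n_tosses, trials=100_000):
    """Estimate the chance that every toss in a series lands heads."""
    hits = sum(
        all(random.random() < 0.5 for _ in range(n_tosses))
        for _ in range(trials)
    )
    return hits / trials

for n in (2, 5, 20):
    print(f"{n:>2} tosses -> P(all heads) ~ {prob_all_heads(n):.3f}")
# 2 tosses -> ~0.250, 5 tosses -> ~0.031, 20 tosses -> ~0.000
```

A 25% chance of a “100% heads” result from 2 tosses tells you nothing about the coin; the same result from 20 tosses would be genuinely informative.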

Thinking, Fast and Slow Summary Part 2-2: Anchors

Anchoring describes the bias where you depend too heavily on an initial piece of information when making decisions.

In quantitative terms, when you are exposed to a number, then asked to estimate an unknown quantity, the initial number affects your estimate of the unknown quantity. Surprisingly, this happens even when the number has no meaningful relevance to the quantity to be estimated.

Examples of anchoring:

  • Students are split into two groups. One group is asked if Gandhi died before or after age 144. The other group is asked if Gandhi died before or after age 32. Both groups are then asked to estimate what age Gandhi actually died at. The first group, who were asked about age 144, estimated a higher age of death than students who were asked about age 32, with a difference in average guesses of over 15 years.
  • Students were shown a wheel of fortune game that had numbers on it. The game was rigged to show only the numbers 10 or 65. The students were then asked to estimate the percentage of African nations in the UN. The average estimates came to 25% and 45%, depending on whether they were shown 10 or 65, respectively. (The sketch below quantifies both studies.)
  • A nonprofit requested different amounts of...
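
Kahneman quantifies this effect with an “anchoring index”: the difference between the groups’ average estimates divided by the difference between the anchors, where 0% means the anchor was ignored and 100% means estimates tracked it completely. Here’s a minimal sketch applying it to the two complete studies above:

```python
def anchoring_index(anchor_lo, anchor_hi, est_lo, est_hi):
    """Ratio of the spread in average estimates to the spread in anchors."""
    return (est_hi - est_lo) / (anchor_hi - anchor_lo)

# Wheel-of-fortune / UN question: anchors 10 and 65, estimates 25% and 45%.
print(f"UN question:     {anchoring_index(10, 65, 25, 45):.0%}")  # ~36%

# Gandhi question: anchors 32 and 144; average guesses differed by ~15 years.
print(f"Gandhi question: {anchoring_index(32, 144, 0, 15):.0%}")  # ~13%
```

Even an obviously rigged wheel of fortune moved estimates by roughly a third of the distance between the anchors.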

Thinking, Fast and Slow Summary Part 2-3: Availability Bias

When trying to answer the question “what do I think about X?,” you actually tend to think about the easier but misleading questions, “what do I remember about X, and how easily do I remember it?” The more easily you remember something, the more significant you perceive what you’re remembering to be. In contrast, things that are hard to remember are lowered in significance.

More quantitatively, when trying to estimate the size of a category or the frequency of an event, you instead use the heuristic: how easily do the instances come to mind? Whatever comes to your mind more easily is weighted as more important or true. This is the availability bias.

This means a few things:

  • Items that are easier to recall take on greater weight than they should.
  • When estimating the size of a category, like “dangerous animals,” if it’s easy to retrieve items for a category, you’ll judge the category to be large.
  • When estimating the frequency of an event, if it’s easy to think of examples, you’ll perceive the event to be more frequent.

In practice, this manifests in a number of ways:

  • Events that trigger stronger emotions (like terrorist attacks) are more readily...

Thinking, Fast and Slow Summary Part 2-4: Representativeness

Read the following description of a person.

Tom W. is meek and keeps to himself. He likes soft music and wears glasses. Which is Tom W. more likely to be: 1) a librarian, or 2) a construction worker?

If you picked librarian without thinking too hard, you used the representativeness heuristic - you matched the description to the stereotype, while ignoring the base rates.

Ideally, you should have examined the base rate of both professions in the male population, then adjusted based on his description. Construction workers outnumber librarians by 10:1 in the US - there are likely more shy construction workers than librarians in total!
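
Here’s what “adjusting from the base rate” looks like as arithmetic. This is a sketch with invented likelihoods: the 90% and 20% below are assumptions for illustration, and only the 10:1 base rate comes from the text.

```python
# Base rate from above: construction workers outnumber librarians 10 to 1.
p_lib, p_con = 1 / 11, 10 / 11

# Invented likelihoods: suppose 90% of librarians fit the meek description,
# but so do 20% of construction workers.
p_desc_given_lib, p_desc_given_con = 0.90, 0.20

# Bayes' rule: P(librarian | description)
p_desc = p_desc_given_lib * p_lib + p_desc_given_con * p_con
p_lib_given_desc = p_desc_given_lib * p_lib / p_desc
print(f"P(librarian | meek description) = {p_lib_given_desc:.0%}")  # ~31%
```

Even a strongly librarian-sounding description leaves Tom W. about twice as likely to be a construction worker, because the base rate dominates.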

More generally, the representativeness heuristic describes when we estimate the likelihood of an event by comparing it to an existing prototype in our minds - matching like to like. But just because something is plausible does not make it more probable.

The representativeness heuristic is strong in our minds and hard to overcome. In experiments, even when people receive data about base rates (like about the proportion of construction workers to librarians), people tend to ignore this information, trusting their stereotype...

Thinking, Fast and Slow Summary Part 2-5: Overcoming the Heuristics

As we’ve been discussing, the general way to overcome statistical heuristics is to estimate the base probability, then make adjustments based on new data. Let’s work through an example.

Julie is currently a senior in a state university. She read fluently when she was four years old. What is her grade point average (GPA)?

People often compute this using intensity matching and representativeness, like so:

  • Reading fluently at 4 puts her at, say, the 90th percentile of all kids.
  • The 90th percentile GPA is somewhere around a 3.9.
  • Thus Julie likely has a 3.9 GPA.

Notice how misguided this line of thinking is! People are predicting someone’s academic performance two decades later based on how she behaved at age 4. System 1 pieces together a coherent story about a smart kid becoming a smart adult.

The proper way to answer questions like these is as follows:

  • Start by estimating the average GPA - this is the base data if you had no information about the student whatsoever. Say this is 3.0.
  • Determine the GPA that matches your impression of the...
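
The remaining steps of Kahneman’s procedure are to estimate how correlated the evidence (early reading) actually is with the outcome (GPA), then move from the baseline toward the intuition-matched GPA only in proportion to that correlation. A sketch with illustrative numbers (the 0.3 correlation is an assumption for the example):

```python
def corrected_prediction(baseline, intuitive_match, correlation):
    """Start at the base rate; move toward the intuitive estimate only
    in proportion to how predictive the evidence actually is."""
    return baseline + correlation * (intuitive_match - baseline)

# Average GPA 3.0 (baseline), intensity-matched GPA 3.9, and an assumed
# 0.3 correlation between reading age and college GPA.
print(f"{corrected_prediction(3.0, 3.9, 0.3):.2f}")  # 3.27
```

With zero correlation you’d just predict the average; with perfect correlation you’d predict the full 3.9. A 0.3 correlation warrants only a modest step away from the base rate, which is why 3.27 is a far less extreme prediction than the intuitive 3.9.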

Thinking, Fast and Slow Summary Part 3: Overconfidence | 1: Flaws In Our Understanding

Part 3 explores biases that lead to overconfidence. With all the heuristics and biases described above working against us, when we construct satisfying stories about the world, we vastly overestimate how much we understand about the past, present, and future.

The general principle of the biases has been this: we desire a coherent story of the world. This comforts us in a world that may be largely random. If it’s a good story, you believe it.

Insidiously, the fewer data points you receive, the more coherent the story you can form. You often don’t notice how little information you actually have and don’t wonder about what is missing. You focus on the data you have, and you don’t imagine all the events that failed to happen (the nonevents). You ignore your ignorance.

And even if you’re aware of the biases, you are nowhere near immune to them. Even when told that these biases exist, you often exempt yourself, believing you’re smart enough to avoid them.

The ultimate test of an explanation is whether it can predict future events accurately. This is the guideline by which you should assess the merits of your beliefs.

Narrative Fallacy

We desire packaging up a...

Thinking, Fast and Slow Summary Part 3-2: Formulas Beat Intuitions

Humans frequently have to make decisions from complicated datasets. Doctors make diagnoses, social workers decide if foster parents are suitable, bank lenders measure business risk, and employers have to hire employees.

Unfortunately, humans are also surprisingly bad at making the right prediction. Across studies, algorithms have beaten or matched humans in making accurate predictions. And even when algorithms merely match human performance, they still win because they’re so much cheaper.

Why are humans so bad? Simply put, humans overcomplicate things.

  • They inappropriately weigh factors that are not predictive of performance (like whether they like the person in an interview).
  • They try too hard to be clever, considering complex combinations of features when simply weighted features are sufficient (see the sketch after this list).
  • Their judgment varies moment to moment without them realizing it. System 1 is very susceptible to influences without the conscious mind realizing it. The person’s environment, current mood, state of hunger, and recent exposure to information can all influence decisions. Algorithms don’t feel hunger.
    • As an example, radiologists who read the same...
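
This is why simple formulas win. As a sketch (our illustration; the features and numbers are invented), here’s the kind of equal-weight model this research found hard to beat: standardize each genuinely predictive factor and just add them up, with no interviews and no clever interactions.

```python
from statistics import mean, stdev

def equal_weight_scores(rows, features):
    """Standardize each feature across candidates, weight all features
    equally, and sum. No clever interactions, no gut feel."""
    stats = {f: (mean(r[f] for r in rows), stdev(r[f] for r in rows))
             for f in features}
    return {
        r["name"]: sum((r[f] - stats[f][0]) / stats[f][1] for f in features)
        for r in rows
    }

# Invented hiring data: ratings on a few predictive factors.
candidates = [
    {"name": "A", "work_sample": 8, "structured_interview": 6, "job_test": 7},
    {"name": "B", "work_sample": 5, "structured_interview": 9, "job_test": 6},
    {"name": "C", "work_sample": 7, "structured_interview": 7, "job_test": 9},
]
scores = equal_weight_scores(
    candidates, ["work_sample", "structured_interview", "job_test"]
)
for name, s in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(name, round(s, 2))
```

The formula’s advantage is consistency: it applies the same weights to every candidate, every time, regardless of mood, hunger, or how likable someone was in the interview.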

Thinking, Fast and Slow Summary Part 3-3: The Objective View

We are often better at analyzing external situations (the “outside view”) than our own. When you look inward at yourself (the “inside view”), it’s too tempting to consider yourself exceptional - “the average rules and statistics don’t apply to me!” And even when you do get statistics, it’s easy to discard them, especially when they conflict with your personal impressions of the truth.

In general, when you have information about an individual case, it’s tempting to believe the case is exceptional, and to disregard statistics of the class to which the case belongs.

Here are examples of situations where people ignore base statistics and hope for the exceptional:

  • 90% of drivers state they’re above-average drivers. Here they don’t necessarily think about what “average” means statistically - instead, they think about whether the skill is easy for them, then intensity match to where they fit in the population.
  • Most people believe they are superior to most others on most desirable traits.
  • When consulted, lawyers may refuse to comment on the projected outcome of a case, saying “every case is unique.”
  • Business owners know that only 35% of new businesses...

Thinking, Fast and Slow Summary Part 4: Choices | 1: Prospect Theory

Part 4 of Thinking, Fast and Slow departs from cognitive biases and turns toward Kahneman’s other major work, prospect theory. This covers risk aversion and risk seeking, our inaccurate weighting of probabilities, and the sunk cost fallacy.

Prior Work on Utility

How do people make decisions in the face of uncertainty? There’s a rich history spanning centuries of scientists and economists studying this question. Each major development in decision theory revealed exceptions that showed the theory’s weaknesses, then led to new, more nuanced theories.

Expected Utility Theory

Traditional “expected utility theory” asserts that people are rational agents that calculate the utility of each situation and make the optimum choice each time.

If you preferred apples to bananas, would you rather have a 10% chance of winning an apple, or 10% chance of winning a banana? Clearly you’d prefer the former.

Similarly, when taking bets, this model assumes that people calculate the expected value and choose the best option.
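
As a minimal sketch of that rational-agent baseline (our illustration, not an example from the book): represent each bet as probability/payoff pairs and pick the highest expected value.

```python
# A bet as a list of (probability, payoff) pairs.
def expected_value(bet):
    return sum(p * x for p, x in bet)

sure_thing = [(1.0, 500)]
coin_flip = [(0.5, 1000), (0.5, 0)]

print(expected_value(sure_thing))  # 500.0
print(expected_value(coin_flip))   # 500.0
# Identical expected values, so this model predicts indifference -
# yet most people strongly prefer the sure $500. That's risk aversion.
```

That gap between the model’s prediction (indifference) and actual behavior (taking the sure thing) is exactly the kind of anomaly discussed next.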

This is a simple, elegant theory that by and large works and is still taught in intro economics. But it failed to explain the phenomenon of risk aversion, where in...

Thinking, Fast and Slow Summary Part 4-2: Implications of Prospect Theory

With the foundation of prospect theory in place, we’ll explore a few implications of the model.

Probabilities are Overweighted at the Edges

Consider which is more meaningful to you:

  • Going from 0% chance of winning $1 million to 5% chance
  • Going from 5% chance of winning $1 million to 10% chance

Most likely you felt better about the first than the second. The mere possibility of winning something (that may still be highly unlikely) is overweighted in its importance. (Shortform note: as Jim Carrey’s character said in the film Dumb and Dumber, in response to a woman who gave him a 1 in a million shot at being with her: “so you’re telling me there’s a chance!”)
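
Kahneman and Tversky later fit this distortion with a probability-weighting function. The version below is from their 1992 cumulative prospect theory paper, using the γ ≈ 0.61 they estimated for gains (these are fitted model parameters, offered here as illustration, not exact constants):

```python
def decision_weight(p, gamma=0.61):
    """Tversky & Kahneman's (1992) probability weighting function."""
    return p**gamma / (p**gamma + (1 - p) ** gamma) ** (1 / gamma)

for p in (0.01, 0.05, 0.50, 0.95, 0.99):
    print(f"stated p = {p:.2f} -> felt weight ~ {decision_weight(p):.2f}")
# 0.01 -> ~0.06 and 0.05 -> ~0.13: small chances loom large (the
# possibility effect); 0.95 -> ~0.79 and 0.99 -> ~0.91: near-certainty
# is discounted (the certainty effect).
```

A 1% chance “feels” like about 6%, which is why the jump from 0% to 5% seems so much more meaningful than the jump from 5% to 10%.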

More examples of this effect:

We fantasize about small chances of big gains.

  • Lottery tickets and gambling in general play on this hope.
  • A small sliver of chance to rescue a failing company is given outsized weight.

We obsess about tiny chances of very bad outcomes.

  • The risk of nuclear disasters and natural disasters is overweighted.
  • We worry about our child coming home late at night, though rationally we know there’s little...

Thinking, Fast and Slow Summary Part 4-3: Variations on a Theme of Prospect Theory

Indifference Curves and the Endowment Effect

Basic theory suggests that people have indifference curves when relating two dimensions, like salary and number of vacation days. Say that you value one day’s salary at about the same as one vacation day.

Theoretically, you should be willing to trade your position for any other point on the indifference curve at any time. So when, at the end of the year, your boss says you’re getting a raise and gives you the choice of 5 extra days of vacation or a salary raise equivalent to 5 days of salary, you see them as pretty equivalent.

But say you get presented with another scenario. Your boss presents a new compensation package, saying that you can get 5 extra days of vacation per year, but then have to take a cut of salary equivalent to 5 days of pay. How would you feel about this?

Likely, the feeling of loss aversion kicked in. Even though theoretically you were on your indifference curve, exchanging 5 days of pay for 5 vacation days, you didn’t see this as an even exchange - the salary cut registered as a loss.

As with prospect theory, the idea of indifference curves ignores the reference point at which you start. In general, people have inertia to change.
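
The same 1992 Tversky and Kahneman paper fits a value function that captures this asymmetry. A sketch using their published parameter estimates (α ≈ 0.88, loss-aversion coefficient λ ≈ 2.25; fitted estimates, not exact constants, and the vacation-day framing is our illustration):

```python
def value(x, alpha=0.88, lam=2.25):
    """Prospect theory value function: outcomes are gains or losses
    relative to a reference point, and losses are weighted ~2.25x."""
    return x**alpha if x >= 0 else -lam * (-x) ** alpha

print(f"v(+5 vacation days)  = {value(+5):.2f}")  # ~ 4.12
print(f"v(-5 days of salary) = {value(-5):.2f}")  # ~ -9.27
# With your current package as the reference point, the pay cut hurts
# more than twice as much as the extra vacation pleases.
```

Because the curve is steeper for losses than for gains, any trade framed as “give up X to get Y” feels worse than the identical trade framed as a fresh choice between packages.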

They call...

Thinking, Fast and Slow Summary Part 4-4: Broad Framing and Global Thinking

When you evaluate a decision, you’re prone to focus on the individual decision, rather than the big picture of all decisions of that type. A decision that might make sense in isolation can become very costly when repeated many times.

Consider both decision pairs, then decide what you would choose in each:

Pair 1

1) A certain gain of $240.

2) 25% chance of gaining $1000 and 75% chance of nothing.

Pair 2

3) A certain loss of $750.

4) 75% chance of losing $1000 and 25% chance of losing nothing.

As we know already, you likely gravitated to Option 1 (risk-averse when facing a sure gain) and Option 4 (risk-seeking when facing a sure loss).

But let’s actually combine the options into pairs and weigh the pairs against each other.

1+4: 75% chance of losing $760 and 25% chance of gaining $240

2+3: 75% chance of losing $750 and 25% chance of gaining $250

Even without calculating these out, 2+3 is clearly superior to 1+4: you have the same chance of losing less money, and the same chance of gaining more money. Yet you probably didn’t think to combine the unique pairings and compare them with each other!
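
A quick sketch verifying the arithmetic (the combination step is just expected-value bookkeeping over the options stated above):

```python
from itertools import product

# Each option: list of (probability, dollar outcome) pairs.
option1 = [(1.00, +240)]               # certain gain of $240
option2 = [(0.25, +1000), (0.75, 0)]   # risky gain
option3 = [(1.00, -750)]               # certain loss of $750
option4 = [(0.75, -1000), (0.25, 0)]   # risky loss

def combine(a, b):
    """Joint outcomes of playing two independent gambles together."""
    return [(pa * pb, xa + xb) for (pa, xa), (pb, xb) in product(a, b)]

for label, combo in (("1+4", combine(option1, option4)),
                     ("2+3", combine(option2, option3))):
    ev = sum(p * x for p, x in combo)
    print(label, combo, f"EV = {ev:+.0f}")
# 1+4 [(0.75, -760), (0.25, 240)] EV = -510
# 2+3 [(0.25, 250), (0.75, -750)] EV = -500
```

2+3 dominates 1+4 outcome for outcome, and its expected value is $10 better - a gap created purely by evaluating each choice in isolation.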

This is the difference between narrow framing and broad framing. The ideal broad framing is to consider every combination of options to find the...

Thinking, Fast and Slow Summary Part 5-1: The Two Selves of Happiness

Part 5 of Thinking, Fast and Slow departs from cognitive biases and mistakes and covers the nature of happiness.

(Shortform note: compared to the previous sections, the concepts in this final portion are more of Kahneman’s recent research interests and are more a work in progress. Therefore, they tend to have less experimental evidence and less finality in their conclusions.)

Happiness is a tricky concept. There is in-the-moment happiness, and there is overall well-being. There is happiness we experience, and happiness we remember.

Consider having to get a number of painful shots a day. There is no habituation, so each shot is as painful as the last. Which one represents a more meaningful change?

  • Decreasing from 20 shots to 18 shots
  • Decreasing from 6 shots to 4 shots

You likely thought the latter was far more meaningful, especially since it drives more closely toward zero pain. But Kahneman found this incomprehensible. Two shots is two shots! The same quantum of pain is being removed in both cases, and the two changes should be valued much more similarly.

In Kahneman’s view, someone who pays different amounts for the same gain of experienced utility is making a...

Thinking, Fast and Slow Summary Part 5-2: Experienced Well-Being vs Life Evaluations

Measuring Experienced Well-Being

How do you measure well-being? The traditional survey question reads: “All things considered, how satisfied are you with your life as a whole these days?”

Kahneman was suspicious that the remembering self would dominate the question, and that people were terrible at “considering all things.” The question tends to trigger the one thing that gives immense pleasure (like dating a new person) or pain (like an argument with a co-worker).

To measure experienced well-being, he led a team to develop the Day Reconstruction Method, which prompts people to relive the day in detailed episodes, then to rate the feelings. Following the philosophy of happiness being the “area under the curve,” they conceived of the metric U-index: the percentage of time an individual spends in an unpleasant state.
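
The U-index is straightforward to compute from reconstructed episodes. Here’s a minimal sketch with an invented day (the episodes and ratings below are made up for illustration):

```python
# An invented day of episodes: (hours, dominant feeling).
day = [
    (1.0, "pleasant"),    # breakfast with family
    (1.5, "unpleasant"),  # commute in heavy traffic
    (4.0, "pleasant"),    # absorbing work
    (1.0, "unpleasant"),  # tense meeting
    (2.5, "pleasant"),    # evening with friends
]

total = sum(hours for hours, _ in day)
bad = sum(hours for hours, mood in day if mood == "unpleasant")
print(f"U-index: {bad / total:.0%} of the day in an unpleasant state")  # 25%
```

Because it’s a fraction of time rather than a remembered rating, the U-index reflects the experiencing self, not the remembering self.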

They reported these findings:

  • There was large inequality in the distribution of pain. 50% of people reported going through a day without an unpleasant episode. But a minority experienced considerable emotional distress for much of the day, for instance from illness, misfortune, or personal disposition.
  • Different activities have different...

Thinking, Fast and Slow Summary Shortform Exclusive: Checklist of Antidotes

As an easy reference, here’s a checklist of antidotes covering every major bias and heuristic from the book.

Cognitive Biases and Heuristics

  • To block System 1 errors, recognize the signs that you’re in trouble and ask System 2 for reinforcement.
  • Observing errors is easier in others than in yourself, so ask others to review your decisions. In this way, organizations can be better than individuals at decision-making.
  • To better regulate your behavior, make critical choices in times of low duress so that System 2 is not taxed.
    • Order food in the morning, not when you’re tired after work or struggling to meet a deadline.
    • Notice when you’re likely to be in times of high duress, and put off big decisions to later. Don’t make big decisions when nervous about others watching.
  • In general, when estimating probability, begin with the baseline probability. Then adjust from this rate based on new data. Do NOT start with your independent guess of probability, since you ignore the data you don’t have.
  • WYSIATI
    • Force yourself to ask: “what evidence am I missing? What evidence would make me change my mind?”
  • Ordering effect
    • Before having a public...
