Thinking, Fast and Slow concerns a few major questions: how do we make decisions? And in what ways do we make decisions poorly?
The book covers three areas of Daniel Kahneman’s research: cognitive biases, prospect theory, and happiness.
Kahneman defines two systems of the mind.
System 1: operates automatically and quickly, with little or no effort, and no sense of voluntary control
System 2: allocates attention to the effortful mental activities that demand it, including complex computations. Often associated with the subjective experience of agency, choice and concentration
System 1 automatically generates suggestions, feelings, and intuitions for System 2. If endorsed by System 2, intuitions turn into beliefs, and impulses turn into voluntary actions.
System 1 can be completely involuntary. You can’t stop your brain from completing 2 + 2 = ?, or from finding a cheesecake delicious. You can’t unsee optical illusions, even if you rationally know what’s going on.
A lazy System 2 accepts what the faulty System 1 gives it without questioning, and this leads to cognitive biases. Even worse, cognitive strain taxes System 2, making it more willing to accept System 1’s suggestions. Therefore, we’re more vulnerable to cognitive biases when we’re stressed.
Because System 1 operates automatically and can’t be turned off, biases are difficult to prevent. Yet it’s also not wise (or energetically possible) to constantly question System 1, and System 2 is too slow to substitute in routine decisions. We should aim for a compromise: recognize situations when we’re vulnerable to mistakes, and avoid large mistakes when the stakes are high.
Despite all the complexities of life, notice that you’re rarely stumped. You rarely face situations as mentally taxing as having to solve 9382 x 7491 in your head.
Isn’t it profound how we can make decisions without realizing it? You like or dislike people before you know much about them; you feel a company will succeed or fail without really analyzing it.
When faced with a difficult question, System 1 substitutes an easier question, or the heuristic question. The answer is often adequate, though imperfect.
For example, the hard question “How happy are you with your life these days?” may be answered with the easier question “What is my mood right now?”
The heuristic question is related to the target question but imperfect. When System 1 produces an imperfect answer, System 2 has the opportunity to reject it, but a lazy System 2 often endorses the heuristic without much scrutiny.
Confirmation bias: We tend to find and interpret information in a way that confirms our prior beliefs. We selectively pay attention to data that fit our prior beliefs and discard data that don’t.
“What you see is all there is”: We don’t consider the global set of alternatives or data, and we don’t realize what data are missing. Related:
Ignoring reversion to the mean: If randomness is a major factor in outcomes, today’s high performers will tend to do worse and today’s low performers will tend to improve, for no meaningful reason. Yet pundits construct superficial causal stories to explain these random fluctuations, claiming that the high performers buckled under the spotlight or that the low performers lit a fire of motivation.
Anchoring: When shown an initial piece of information, you bias toward that information, even if it’s irrelevant to the decision at hand. For instance, in one study, when a nonprofit requested $400, the average donation was $143; when it requested $5, the average donation was $20. The first piece of information (in this case, the suggested donation) influences our decision (in this case, how much to donate), even though the suggested amount shouldn’t be relevant to deciding how much to give.
Representativeness: You tend to use your stereotypes to make decisions, even when they contradict common sense statistics. For example, if you’re told about someone who is meek and keeps to himself, you’d guess the person is more likely to be a librarian than a construction worker, even though there are far more of the latter than the former in the country.
Availability bias: Vivid images and stronger emotions make items easier...
Here's a preview of the rest of Shortform's Thinking, Fast and Slow summary:
We believe we’re being rational most of the time, but really much of our thinking is automatic, done subconsciously by instinct. Most impressions arise without your knowing how they got there. Can you pinpoint exactly how you knew a man was angry from his facial expression, or how you could tell that one object was farther away than another, or why you laughed at a funny joke?
This becomes more practically important for the decisions we make. Often, we’ve decided what we’re going to do before we even realize it. Only after this subconscious decision does our rational mind try to justify it.
The brain does this to save on effort, substituting easier questions for harder ones. Instead of thinking, “Should I invest in Tesla stock? Is it priced correctly?” you might instead think, “Do I like Tesla cars?” The insidious part is that you often don’t notice the substitution. This type of substitution produces systematic errors, also called biases. We are blind to our blindness.
In Thinking, Fast and Slow, Kahneman defines two systems of the mind:
System 1: operates automatically and quickly, with little or no effort, and no...
System 2 thinking has a limited budget of attention - you can only do so many cognitively difficult things at once.
This limitation is true when doing two tasks at the same time - if you’re navigating traffic on a busy highway, it becomes far harder to solve a multiplication problem.
This limitation is also true when one task comes after another - depleting System 2 resources earlier in the day can lower inhibitions later. For example, a hard day at work will make you more susceptible to impulsive buying from late-night infomercials. This is also known as “ego depletion,” or the idea that you have a limited pool of willpower or mental resources that can be depleted each day.
All forms of voluntary effort - cognitive, emotional, physical - seem to draw at least partly on a shared pool of mental energy.
The law of least effort states that “if there are several ways of achieving the same goal, people will eventually gravitate to the least demanding course of...
Think of your brain as a vast network of ideas connected to each other. These ideas can be concrete or abstract. The ideas can involve memories, emotions, and physical sensations.
When one node in the network is activated, say by seeing a word or image, it automatically activates its surrounding nodes, rippling outward like a pebble thrown in water.
As an example, consider the following two words:
“Bananas Vomit”
Suddenly, within a second, reading those two words may have triggered a host of different ideas. You might have pictured yellow fruits; felt a physiological aversion in the pit of your stomach; remembered the last time you vomited; thought about other diseases - all done automatically without your conscious control.
The evocations can be self-reinforcing - a word evokes memories, which evoke emotions, which evoke facial expressions, which evoke other reactions, and which reinforce other ideas.
Links between ideas come in several forms:
In the next exercise, you’ll be shown three words....
System 1 continuously monitors what’s going on outside and inside the mind and generates assessments with little effort and without intention. The basic assessments include language, facial recognition, social hierarchy, similarity, causality, associations, and exemplars.
However, not every attribute of the situation is measured. System 1 is much better at comparing things and judging the average of a set than at computing sums. Here’s an example:
Picture a set of lines of different lengths, and try to quickly determine their average length.
Now try to determine the sum of their lengths. This is less intuitive and requires System 2.
Unlike System 2 thinking, these basic assessments of System 1 are not impaired when the observer is cognitively busy.
In addition to basic assessments, System 1 also has two other...
"I LOVE Shortform as these are the BEST summaries I’ve ever seen...and I’ve looked at lots of similar sites. The 1-page summary and then the longer, complete version are so useful. I read Shortform nearly every day."
Putting it all together, we are most vulnerable to biases when System 1 jumps to a coherent conclusion and a lazy or cognitively strained System 2 endorses it without scrutiny.
In day-to-day life, this is acceptable if the conclusions are likely to be correct, the costs of a mistake are acceptable, and if the jump saves time and effort. You don’t question whether to brush your teeth each day, for example.
In contrast, this shortcut in thinking is risky when the stakes are high and there’s no time to collect more information, like when serving on a jury, deciding which job applicant to hire, or reacting to a weather emergency.
We’ll end part 1 with a collection of biases.
When presented with evidence, especially evidence that confirms your mental model, you do not question what might be missing. System 1 seeks to build the most coherent story it can - it does not stop to examine the quality and quantity of the information.
In an experiment, three groups were given background to a legal case....
Kahneman transitions from Part 1 to Part 2 by explaining more heuristics and biases we’re subject to.
The general theme of these biases: we prefer certainty over doubt. We prefer coherent stories of the world, clear causes and effects. Sustaining incompatible viewpoints at once is harder work than sliding into certainty. A message, if it is not immediately rejected as a lie, will affect our thinking, regardless of how unreliable the message is.
Furthermore, we pay more attention to the content of a story than to the reliability of the data behind it. We prefer simpler, more coherent views of the world and overlook the reasons those views aren’t warranted. We overestimate causal explanations and ignore base statistical rates. As a result, intuitive predictions are often too extreme, and we put too much faith in them.
This chapter will focus on statistical mistakes - when our biases make us misinterpret statistical truths.
The smaller your sample size, the more likely you are to have extreme results. When you have small sample sizes, do NOT be misled by outliers.
A facetious example: in a series of 2 coin tosses, you have a 25% chance of getting 100% heads - an extreme result that becomes vanishingly unlikely as the series gets longer....
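To make the small-sample point concrete, here's a minimal simulation sketch (not from the book; the 80% threshold for calling a result "extreme" is an arbitrary choice) showing how often small and large runs of fair coin tosses come out lopsided:

```python
import random

def share_extreme(n_tosses, n_trials=100_000, threshold=0.8):
    """Fraction of trials where the proportion of heads is >= threshold or <= 1 - threshold."""
    extreme = 0
    for _ in range(n_trials):
        heads = sum(random.random() < 0.5 for _ in range(n_tosses))
        p = heads / n_tosses
        if p >= threshold or p <= 1 - threshold:
            extreme += 1
    return extreme / n_trials

# Small samples produce "extreme" proportions far more often than large ones.
for n in (2, 10, 100):
    print(n, share_extreme(n))
# Expected pattern: about 0.5 for n=2, roughly 0.11 for n=10, essentially 0 for n=100.
```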
Anchoring describes the bias where you depend too heavily on an initial piece of information when making decisions.
In quantitative terms, when you are exposed to a number, then asked to estimate an unknown quantity, the initial number affects your estimate of the unknown quantity. Surprisingly, this happens even when the number has no meaningful relevance to the quantity to be estimated.
Examples of anchoring include the donation study described earlier, where the suggested amount ($400 versus $5) anchored how much people actually gave.
When trying to answer the question “what do I think about X?,” you actually tend to think about the easier but misleading questions, “what do I remember about X, and how easily do I remember it?” The more easily you remember something, the more significant you perceive what you’re remembering to be. In contrast, things that are hard to remember are lowered in significance.
More quantitatively, when trying to estimate the size of a category or the frequency of an event, you instead use the heuristic: how easily do the instances come to mind? Whatever comes to your mind more easily is weighted as more important or true. This is the availability bias.
This means a few things:
In practice, this manifests in a number of ways:
Read the following description of a person.
Tom W. is meek and keeps to himself. He likes soft music and wears glasses. Which profession is Tom W. more likely to be? 1) Librarian. 2) Construction worker.
If you picked librarian without thinking too hard, you used the representativeness heuristic - you matched the description to the stereotype, while ignoring the base rates.
Ideally, you should have examined the base rate of both professions in the male population, then adjusted based on his description. Construction workers outnumber librarians by 10:1 in the US - there are likely more meek construction workers than there are librarians in total!
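A back-of-the-envelope Bayes calculation makes the point; the likelihoods of a "meek" description below are hypothetical, and only the 10:1 base rate comes from the summary:

```python
# Hypothetical likelihoods: suppose 40% of librarians fit the "meek" description
# but only 10% of construction workers do. Base rate (from the summary): 10
# construction workers for every librarian.
p_meek_given_librarian = 0.40      # assumed
p_meek_given_construction = 0.10   # assumed
p_librarian, p_construction = 1 / 11, 10 / 11  # 10:1 base rate

# Bayes' rule: P(librarian | meek) is proportional to P(meek | librarian) * P(librarian)
num_lib = p_meek_given_librarian * p_librarian
num_con = p_meek_given_construction * p_construction
p_librarian_given_meek = num_lib / (num_lib + num_con)
print(round(p_librarian_given_meek, 2))  # ~0.29: even a strong stereotype loses to the base rate
```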
More generally, the representativeness heuristic describes when we estimate the likelihood of an event by comparing it to an existing prototype in our minds - matching like to like. But just because something is plausible does not make it more probable.
The representativeness heuristic is strong in our minds and hard to overcome. In experiments, even when people receive data about base rates (like about the proportion of construction workers to librarians), people tend to ignore this information, trusting their stereotype...
As we’ve been discussing, the general solution to overcoming statistical heuristics is to estimate the base probability, then make adjustments based on new data. Let’s work through an example.
Julie is currently a senior in a state university. She read fluently when she was four years old. What is her grade point average (GPA)?
People often compute this using intensity matching and representativeness: reading fluently at age four sounds impressively precocious, so they match it to a similarly impressive GPA.
Notice how misguided this line of thinking is! People are predicting someone’s academic performance 2 decades later based on how they behaved at 4. System 1 pieces together a coherent story about a smart kid becoming a smart adult.
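A small simulation sketch (with hypothetical numbers, not from the book) shows why such predictions should be regressive: when an early signal is only weakly correlated with a later outcome, the people with the most extreme signals land, on average, much closer to the mean on the outcome.

```python
import random

random.seed(0)
r = 0.3  # assumed (weak) correlation between reading precocity at age 4 and later GPA
samples = []
for _ in range(100_000):
    precocity = random.gauss(0, 1)                                # early signal (z-score)
    gpa = r * precocity + (1 - r**2) ** 0.5 * random.gauss(0, 1)  # later outcome (z-score)
    samples.append((precocity, gpa))

precocious = [(p, g) for p, g in samples if p > 2.0]  # the most precocious ~2% of kids
avg_signal = sum(p for p, _ in precocious) / len(precocious)
avg_outcome = sum(g for _, g in precocious) / len(precocious)
print(round(avg_signal, 2), round(avg_outcome, 2))
# Roughly 2.4 vs 0.7: an extreme early signal predicts an only mildly above-average outcome.
```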
The proper way to answer questions like these is as follows:
Part 3 explores biases that lead to overconfidence. With all the heuristics and biases described above working against us, when we construct satisfying stories about the world, we vastly overestimate how much we understand about the past, present, and future.
The general principle of the biases has been this: we desire a coherent story of the world. This comforts us in a world that may be largely random. If it’s a good story, you believe it.
Insidiously, the fewer data points you receive, the more coherent the story you can form. You often don’t notice how little information you actually have and don’t wonder about what is missing. You focus on the data you have, and you don’t imagine all the events that failed to happen (the nonevents). You ignore your ignorance.
And even if you’re aware of the biases, you are nowhere near immune to them. Even when told that these biases exist, you often exempt yourself, assuming you’re smart enough to avoid them.
The ultimate test of an explanation is whether it can predict future events accurately. This is the guideline by which you should assess the merits of your beliefs.
We desire packaging up a...
Humans frequently have to make judgments from complicated data: doctors make diagnoses, social workers decide if foster parents are suitable, bank lenders assess business risk, and employers screen job applicants.
Unfortunately, humans are also surprisingly bad at making the right prediction. In study after study, algorithms have matched or beaten human experts at making accurate predictions. And even when algorithms merely match human performance, they still win because they are so much cheaper.
Why are humans so bad? Simply put, humans overcomplicate things.
We are often better at analyzing external situations (the “outside view”) than our own. When you look inward at yourself (the “inside view”), it’s too tempting to consider yourself exceptional— “the average rules and statistics don’t apply to me!” And even when you do get statistics, it’s easy to discard them, especially when they conflict with your personal impressions of the truth.
In general, when you have information about an individual case, it’s tempting to believe the case is exceptional, and to disregard statistics of the class to which the case belongs.
Here are examples of situations where people ignore base statistics and hope for the exceptional:
Part 4 of Thinking, Fast and Slow departs from cognitive biases and turns to Kahneman’s other major work, prospect theory. It covers risk aversion and risk seeking, our inaccurate weighting of probabilities, and the sunk cost fallacy.
How do people make decisions in the face of uncertainty? There’s a rich history spanning centuries of scientists and economists studying this question. Each major development in decision theory revealed exceptions that showed the theory’s weaknesses, then led to new, more nuanced theories.
Traditional “expected utility theory” asserts that people are rational agents who calculate the utility of each option and make the optimal choice each time.
If you preferred apples to bananas, would you rather have a 10% chance of winning an apple, or 10% chance of winning a banana? Clearly you’d prefer the former.
Similarly, when taking bets, this model assumes that people calculate the expected value and choose the best option.
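A minimal sketch of that expected-value logic, with hypothetical gamble amounts:

```python
def expected_value(gamble):
    """Expected value of a gamble given as (probability, payoff) pairs."""
    return sum(p * payoff for p, payoff in gamble)

# Hypothetical choice: a sure $46 versus an 80% chance of $60.
sure_thing = [(1.0, 46)]
risky_bet = [(0.8, 60), (0.2, 0)]

print(expected_value(sure_thing))  # 46.0
print(expected_value(risky_bet))   # 48.0 -> a pure expected-value maximizer takes the bet
```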
This is a simple, elegant theory that by and large works and is still taught in intro economics. But it failed to explain the phenomenon of risk aversion, where in...
With the foundation of prospect theory in place, we’ll explore a few implications of the model.
Consider which is more meaningful to you:
Most likely you felt better about the first than the second. The mere possibility of winning something (even if it remains highly unlikely) is overweighted in its importance. (Shortform note: as Jim Carrey’s character said in the film Dumb and Dumber, in response to a woman who gave him a one in a million shot at being with her: “So you’re telling me there’s a chance!”)
More examples of this effect:
We fantasize about small chances of big gains.
We obsess about tiny chances of very bad outcomes.
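How much small probabilities get inflated can be illustrated with the probability-weighting function from Kahneman and Tversky's cumulative prospect theory; the functional form and the gamma = 0.61 parameter come from their published 1992 estimates for gains, not from this summary's text:

```python
def decision_weight(p, gamma=0.61):
    """Probability-weighting function for gains (Tversky & Kahneman, 1992)."""
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

for p in (0.01, 0.10, 0.50, 0.90, 0.99):
    print(f"{p:.0%} chance is weighted like {decision_weight(p):.1%}")
# A 1% chance feels like ~5.5%, while a 99% chance feels like only ~91%:
# small chances of big wins (or losses) loom larger than they should.
```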
Basic theory suggests that people have indifference curves when relating two dimensions, like salary and number of vacation days. Say that you value one day’s salary at about the same as one vacation day.
Theoretically, you should be willing to trade for any other point on the indifference curve at any time. So when, at the end of the year, your boss says you’re getting a raise and offers the choice of 5 extra days of vacation or a salary raise equivalent to 5 days of pay, you see the two options as roughly equivalent.
But say you get presented with another scenario. Your boss presents a new compensation package, saying that you can get 5 extra days of vacation per year, but then have to take a cut of salary equivalent to 5 days of pay. How would you feel about this?
Likely, the feeling of loss aversion kicked in. Even though theoretically you were on your indifference curve, exchanging 5 days of pay for 5 vacation days, you didn’t see this as an immediate exchange.
As prospect theory highlights, the idea of indifference curves ignores the reference point at which you start. In general, people have inertia to change.
They call...
When you evaluate a decision, you’re prone to focus on the individual decision, rather than the big picture of all decisions of that type. A decision that might make sense in isolation can become very costly when repeated many times.
Consider both decision pairs, then decide what you would choose in each:
Pair 1
1) A certain gain of $240.
2) 25% chance of gaining $1000 and 75% chance of nothing.
Pair 2
3) A certain loss of $750.
4) 75% chance of losing $1000 and 25% chance of losing nothing.
As we know already, you likely gravitated to Option 1 and Option 4.
But let’s actually combine those two options, and weigh against the other.
1+4: 75% chance of losing $760 and 25% chance of gaining $240
2+3: 75% chance of losing $750 and 25% chance of gaining $250
Even without calculating these out, 2+3 is clearly superior to 1+4: you have the same chance of losing less money and the same chance of gaining more money. Yet you probably didn’t think to combine the options into pairs and weigh those pairs against each other!
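A quick sanity check of that arithmetic, as a minimal sketch:

```python
def ev(gamble):
    """Expected value of a gamble given as (probability, payoff) pairs."""
    return sum(p * x for p, x in gamble)

# 1 + 4: take the sure $240, then face 75% odds of losing $1000.
combo_1_4 = [(0.75, 240 - 1000), (0.25, 240)]   # 75% lose $760, 25% gain $240
# 2 + 3: accept the sure $750 loss, plus a 25% shot at $1000.
combo_2_3 = [(0.25, 1000 - 750), (0.75, -750)]  # 25% gain $250, 75% lose $750

print(ev(combo_1_4), ev(combo_2_3))  # -510.0 -500.0: the intuitive pair (1+4) is strictly worse
```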
This is the difference between narrow framing and broad framing. The ideal broad framing is to consider every combination of options to find the...
Part 5 of Thinking, Fast and Slow departs from cognitive biases and mistakes and covers the nature of happiness.
(Shortform note: compared to the previous sections, the concepts in this final portion are more of Kahneman’s recent research interests and are more a work in progress. Therefore, they tend to have less experimental evidence and less finality in their conclusions.)
Happiness is a tricky concept. There is in-the-moment happiness, and there is overall well-being. There is happiness we experience, and happiness we remember.
Consider having to get a number of painful shots a day. There is no habituation, so each shot is as painful as the last. Which one represents a more meaningful change?
You likely thought the latter was far more meaningful, especially since it drives more closely toward zero pain. But Kahneman found this incomprehensible. Two shots is two shots! There is a quantum of pain that is being removed, and the two choices should be evaluated as much closer.
In Kahneman’s view, someone who pays different amounts for the same gain of experienced utility is making a...
How do you measure well-being? The traditional survey question reads: “All things considered, how satisfied are you with your life as a whole these days?”
Kahneman was suspicious that the remembering self would dominate the answer, and that people are terrible at “considering all things.” The question tends to trigger the one thing that currently gives immense pleasure (like dating a new person) or pain (like an argument with a co-worker).
To measure experienced well-being, he led a team to develop the Day Reconstruction Method, which prompts people to relive the day in detailed episodes, then to rate the feelings. Following the philosophy of happiness being the “area under the curve,” they conceived of the metric U-index: the percentage of time an individual spends in an unpleasant state.
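Here's a minimal sketch of how a U-index could be computed from Day Reconstruction data; the sample episodes and the rule for flagging an episode as unpleasant are hypothetical:

```python
# Each episode: (duration in minutes, dominant feeling). "unpleasant" marks episodes
# in which a negative feeling was the strongest one reported (this rule is assumed).
episodes = [
    (90, "pleasant"),    # breakfast with family
    (60, "unpleasant"),  # commute in heavy traffic
    (240, "pleasant"),   # focused work
    (45, "unpleasant"),  # argument with a co-worker
    (180, "pleasant"),   # evening with friends
]

total = sum(minutes for minutes, _ in episodes)
unpleasant = sum(minutes for minutes, feeling in episodes if feeling == "unpleasant")
u_index = unpleasant / total
print(f"U-index: {u_index:.0%}")  # ~17% of the day spent in an unpleasant state
```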
They reported these findings:
As an easy reference, here’s a checklist of antidotes covering every major bias and heuristic from the book.
Cognitive Biases and Heuristics