This article is an excerpt from the Shortform summary of "Thinking, Fast and Slow" by Daniel Kahneman. Shortform has the world's best summaries of books you should be reading.
Like this article? Sign up for a free trial here.
What is availability bias? When does it occur, and how can you avoid it?
Availability bias is the tendency to place more importance on information we can easily remember. The more easily you remember something, the more significant you perceive it to be. In contrast, things that are hard to remember seem less significant.
Learn how the availability bias, also known as the availability heuristic in psychology, hurts our thinking skills. We’ll cover the role of availability bias in the media and what you can do to overcome availability bias.
Availability Heuristic Bias
When trying to answer the question “what do I think about X?,” you actually tend to think about the easier but misleading questions, “what do I remember about X, and how easily do I remember it?” This is the availability bias at work.
More quantitatively, when trying to estimate the size of a category or the frequency of an event, you rely on the availability heuristic: how easily do instances come to mind? Whatever comes to mind more easily is weighted as more important or more likely to be true.
Using the availability heuristic means a few things:
- Items that are easier to recall take on greater weight than they should.
- When estimating the size of a category, like “dangerous animals,” if it’s easy to retrieve items for a category, you’ll judge the category to be large.
- When estimating the frequency of an event, if it’s easy to think of examples, you’ll perceive the event to be more frequent.
How Availability Bias Manifests
In practice, the availability bias manifests in a number of ways:
- Events that trigger stronger emotions (like terrorist attacks) are more readily available than events that don’t (like deaths from diabetes), causing you to overestimate the importance of the more provocative events.
- More recent events are more available than past events, and are therefore judged to be more important.
- More vivid, visual examples are more available than mere words. For instance, it’s easier to remember the details of a painting than it is to remember the details of a passage of text. Consequently, we often value visual information over verbal.
- Personal experiences are more available than statistics or data.
- Famously, spouses were asked to estimate their percentage contribution to household tasks. When you add both spouses’ answers, the total tends to exceed 100% – for instance, each spouse believes they contribute 70% of household tasks. Because of availability bias, each spouse primarily sees the work they themselves have done, not their partner’s contribution, and so each overestimates their own share.
- Items that are covered more in popular media take on a greater perceived importance than those that aren’t, even if the topics that aren’t covered have more practical importance.
Availability bias also leads us to overweight small risks. Parents anxiously waiting for their teenage child to come home at night are obsessing over the fears that are readily available to their minds, rather than the realistically low chance that the child is actually in danger.
Availability Bias and the Media
Within the media, availability bias can cause a vicious cycle where something minor gets blown out of proportion:
- A minor but curious event is reported, and a group of people overreacts to the news.
- News about the overreaction triggers more attention and coverage of the event. Since media companies make money from reporting worrying news, they hop on the bandwagon and make it an item of constant news coverage.
- This continues snowballing as more and more people come to see the event as a crisis.
- Naysayers who say the event is not a big deal are dismissed as participants in a cover-up.
- Eventually, all of this can affect real policy, where scarce resources are used to solve an overreaction rather than a quantitatively more important problem.
In Thinking, Fast and Slow, Kahneman cites the example of New York’s Love Canal in the 1970s, where buried toxic waste polluted a water well. Residents were outraged, and the media seized on the story, claiming it was a disaster. Eventually legislation was passed that mandated the expensive cleanup of toxic sites. Kahneman argues that the pollution has not been shown to have any actual health effects, and the money could have been spent on far more worthwhile causes to save more lives.
He also points to terrorism as a modern example of an issue that is widely reported by the media. As a result, terrorism is far more available in our minds than its actual danger warrants: only a very small fraction of the population dies in terrorist attacks.
Kahneman is sympathetic to these fears, however. He notes that even irrational fear is debilitating, and policymakers need to protect the public from fear, not just from real dangers.
Running Out of Availability Makes You Less Confident
A series of experiments asked people to list a number of instances of a situation (such as 6 examples of when they were assertive). Then they were asked to answer a question (such as “evaluate how assertive you are”).
Question: What has a greater effect on people’s perception of how assertive they are: the number of examples they can come up with, or the ease of recall? In other words, does someone who comes up with 12 examples of being assertive feel more confident than someone who comes up with only 6?
You might think more examples would strengthen conviction, but being forced to think of more examples actually lowers your confidence. When people are asked to name 6 examples of their assertiveness, they feel more assertive than those asked to name 12 examples. The difficulty of scraping up the last few examples dampens one’s confidence. This is another effect of availability bias.
Similarly, people are less confident in a belief when they’re asked to produce more arguments to support it. The act of scraping the bottom of the barrel for ideas gives you the feeling that your ideas are less available, which then weakens your belief.
There are some exceptions to this effect of availability bias:
- When people are given a cover story for their difficulty in recall, this effect dissipates.
- For example, in a variant of the assertiveness experiment above, subjects were told that listening to a particular piece of music would impair their recall ability. People who were asked to name 12 examples of assertiveness then rated themselves just as assertive as those asked to name 6. Having a reason for their difficulty with recall prevented them from being demoralized by it.
- Profoundly, this means System 2 can influence how surprised System 1 feels. This is analogous to being told before you meet someone, “the man you’re about to meet has 7 fingers—don’t be surprised.”
- This effect reverses when someone has had personal experience with the situation.
- When asked what behaviors would help prevent heart disease, people with a family history of heart disease felt safer when they retrieved more instances. In this case, they weren’t bothered by how difficult the last items were to recall – the more behaviors they could think of, the better they felt.
Antidotes to Availability Bias
The conclusion is that System 1 uses ease of recall as a heuristic, while System 2 focuses on the content of what is being recalled, rather than just the ease. Therefore, you’re more susceptible to availability bias when System 2 is being taxed.
Experiments also show that you’re more susceptible to availability bias:
- When you’re in a good mood
- When you’re a novice in the field rather than an expert
- When you’re made to feel powerful and successful
Shortform note: To counteract availability bias, think deliberately about what you are recalling and assign each item a weight based on its actual significance. This helps you avoid overestimating things that are merely easy to remember.
For example, when weighing reasons for and against quitting your job, write down all the reasons, then score each one by significance rather than favoring the reasons you remember most easily. A rough sketch of this scoring approach is shown below.
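Here is a minimal sketch of that idea in Python, assuming a simple 1–10 significance scale; the reasons and weights are made-up examples, not recommendations:

```python
# Hypothetical sketch of deliberate scoring (reasons and weights are made up).
# The point: every reason gets an explicit significance score, so vivid,
# easily remembered reasons don't automatically dominate the decision.

reasons_to_quit = {
    "no growth opportunities": 8,        # significance on a 1-10 scale
    "long commute": 4,
    "recent argument with manager": 2,   # vivid and easy to recall, but low weight
}

reasons_to_stay = {
    "stable income": 9,
    "good health benefits": 6,
    "friendly coworkers": 5,
}

quit_score = sum(reasons_to_quit.values())
stay_score = sum(reasons_to_stay.values())

print(f"Quit score: {quit_score}, Stay score: {stay_score}")
print("Leaning toward quitting" if quit_score > stay_score else "Leaning toward staying")
```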
When estimating the number of deaths from lightning strikes or diabetes, estimate from first principles: how many people have diabetes, and how many people have died from lightning strikes? What official numbers can you remember to ground your estimate? Don’t start from what you happen to remember about each, whether it’s a news story about a lightning strike or a family member with diabetes.
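As a rough sketch of this kind of base-rate estimate, here is how the arithmetic might look; every number below is a placeholder, not a real statistic, and should be replaced with official figures:

```python
# Hypothetical sketch: estimate from base rates instead of memorable stories.
# All values are placeholders -- look up official statistics before relying on them.

population = 330_000_000            # placeholder national population
diabetes_prevalence = 0.10          # placeholder: fraction of people with diabetes
diabetes_annual_mortality = 0.003   # placeholder: fraction of that group who die each year

lightning_deaths_per_year = 25      # placeholder annual figure

diabetes_deaths_estimate = population * diabetes_prevalence * diabetes_annual_mortality

print(f"Estimated diabetes deaths per year: {diabetes_deaths_estimate:,.0f}")
print(f"Assumed lightning deaths per year: {lightning_deaths_per_year}")
print(f"Diabetes deaths are roughly {diabetes_deaths_estimate / lightning_deaths_per_year:,.0f}x more common")
```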
———End of Preview———
Like what you just read? Read the rest of the world's best summary of "Thinking, Fast and Slow" at Shortform. Learn the book's critical concepts in 20 minutes or less.
Here's what you'll find in our full Thinking, Fast and Slow summary :
- Why we get easily fooled when we're stressed and preoccupied
- Why we tend to overestimate the likelihood of good things happening (like the lottery)
- How to protect yourself from making bad decisions and from scam artists