This article is an excerpt from the Shortform book guide to "Fooled By Randomness" by Nassim Nicholas Taleb. Shortform has the world's best summaries and analyses of books you should be reading.
Like this article? Sign up for a free trial here.
What are cognitive shortcuts? What role do they play in survival? What are the dangers inherent in shortcut thinking?
Our brains have developed cognitive shortcuts that allow us to react quickly and decisively to threats. Unfortunately, these shortcuts often lead us to believe things without fully thinking them through. As a result, our views of the world are often based on misunderstandings and biases we unwittingly hold.
We’ll consider a few of these misunderstandings and biases below.
What Are Cognitive Shortcuts?
Because the brain’s cognitive resources are limited, we’ve evolved many thinking shortcuts to save ourselves time and mental energy; if we were to stop and think thoroughly about each interaction we have throughout the day, we would either miss opportunities or succumb to threats.
However, because cognitive shortcuts are automatic, they often prevent us from correctly evaluating probabilities, and as a result, lead us to make poor decisions and take unjustified risks.
We Make Decisions Emotionally
Neurologists observe that the human brain has developed into three general parts: the primitive brain, the emotional brain, and the rational brain. The rational brain acts as an advisor, but it’s the other two parts—primitive and emotional—that are responsible for decision-making.
This is not inherently a bad thing. Our thoughts can advise us, but without a feeling to direct us toward one option or another, we get caught in endless rational deliberation about the best course of action. This can be seen in patients whose brain trauma destroyed their ability to feel emotions but left their intelligence intact, making them completely rational beings. People with this sort of brain damage cannot make even decisions as simple as whether or not to get out of bed in the morning.
The negative side of this, of course, is that emotions can steer us wrong and cause us to make mistakes. Emotions can cloud our judgment by blocking out rational thinking and causing us to wrongly assess risk, thereby leading us to make poor decisions. For example, we might buy a particular stock because we love the company and get emotionally invested in its future, though it may not be financially wise to do so.
Feelings also steer us wrong because people are more emotionally affected by negative events than by positive ones. This means they also view volatility much more starkly when it involves falling prices than when it involves rising ones. Likewise, volatility during negative world events is seen as worse than volatility in peaceful times. For example, in the eighteen months leading up to September 11, 2001, the market was more volatile than in the same period afterward, yet the later period's volatility received far more media attention. As a result, people are more likely to make moves during times of stress, even if those moves are not strategically wise.
We Like Simplicity
To better identify risk, the primitive and emotional parts of our psyche have evolved to prioritize speed when scanning the environment for threats. Because of this, we don’t like complexity. We respond best to simple concepts that are easily understood and quickly summed up. Often we regard complex ideas with suspicion, assuming ill intent or falsehood.
Because of this, we tend to avoid concepts that feel difficult to explain, even when those concepts are more enlightening than simpler ones. We therefore tend to gloss over the finer points of probabilities, which are not only difficult to understand but are often also counter-intuitive.
For example, a study of how medical professionals interpret probabilities shed light on how often even people who are supposed to know better get them wrong. Doctors were asked this question: A disease affects one in 1,000 people in a given population. People are tested for it randomly with a test that has a 5 percent false positive rate and no false negatives. If someone tests positive, what is the percentage likelihood that she has the disease?
Most doctors responded by saying she'd be 95 percent likely to have it (since the test has a 95 percent accuracy rate). However, a person testing positive under these conditions would in fact be only about 2 percent likely to be sick. (If 1,000 people are tested, only one will actually be sick, but roughly 50 of the remaining 999 will test falsely positive, for a total of about 51 positive tests and only one actual illness. One divided by 51 is about 2 percent.) Fewer than one in five respondents answered correctly, because the right answer feels counter-intuitive.
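(Shortform note: For readers who want to verify the arithmetic, here is a minimal sketch in Python of the same calculation using Bayes' rule. The numbers come from the example above; the variable names are ours, not the study's.)

```python
# Sketch: probability of actually being sick given a positive test,
# using the figures from the example above.

prevalence = 1 / 1000        # 1 in 1,000 people actually has the disease
false_positive_rate = 0.05   # 5% of healthy people wrongly test positive
true_positive_rate = 1.0     # no false negatives: every sick person tests positive

# Probability of testing positive at all:
# (sick and correctly flagged) + (healthy and wrongly flagged)
p_positive = (prevalence * true_positive_rate
              + (1 - prevalence) * false_positive_rate)

# Probability of actually being sick, given a positive test
p_sick_given_positive = (prevalence * true_positive_rate) / p_positive

print(f"{p_sick_given_positive:.1%}")  # prints roughly 2.0%, not 95%
```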
(Shortform note: This does not mean that people are getting regularly treated for diseases they don’t have. The scenario doesn’t account for the human element of testing: Most people only get tested for a disease when they have symptoms of something, which increases the likelihood that a positive result does indicate sickness.
But the math holds true in real life for diseases that are uncommon but for which asymptomatic people get regularly tested—for example, breast cancer. There is a fairly high rate of false positives for mammograms, and the vast majority of those who test positive do not turn out to be sick. These false alarms are weeded out through further testing.)
We Notice Surprises
The primitive and emotional sections of our brain also pay much closer attention to surprises than to run-of-the-mill news. We attach greater significance to shocking events even if they are not ultimately important, and tend to believe events that are more easily recalled are more likely to occur. We therefore overestimate the risk of unlikely events while ignoring the risk of more likely ones.
We can see this in how the media covers bizarre but relatively unthreatening news while ignoring much more common—and more likely—threats. For example, in the 1990s, mad cow disease received feverish media coverage but killed only several hundred people over the course of a decade. You were far more likely to be killed in a car accident on the way to a restaurant than by the tainted meat you might eat there. But due to the skewed media focus, people became more frightened of the (unlikely) threat of mad cow disease than of threats they were far more likely to face.
We Dislike Abstraction
Because for most of human history people faced tangible threats rather than theoretical probabilities, our brains evolved to understand concrete ideas better than abstract ones. As a result, we have trouble assessing the risks of abstract circumstances. Studies have shown that when presented with two sets of risks, people are more concerned about the one that describes specific threats, even when the more general set logically includes those specific threats.
For example, travelers are more likely to buy insurance covering death from a terrorist attack on their trip than insurance covering death from any cause (which includes, but doesn't specify, terrorism). In another example, a study found that people rated an earthquake in California as more likely than an earthquake in North America (which again includes, but doesn't specify, California).
———End of Preview———
Like what you just read? Read the rest of the world's best book summary and analysis of Nassim Nicholas Taleb's "Fooled By Randomness" at Shortform.
Here's what you'll find in our full Fooled By Randomness summary :
- The outsized role luck plays in success
- How we’re fooled by randomness in many aspects of our lives
- How we can accommodate randomness in our lives once we’re aware of it