This article is an excerpt from the Shortform summary of "Thinking, Fast and Slow" by Daniel Kahneman. Shortform has the world's best summaries of books you should be reading.
What is overconfidence bias? How do you avoid it?
Overconfidence bias is when a person feels more confident in the accuracy of his or her judgment than objective standards would indicate. Overconfidence bias can lead to bad decisions and faulty predictions.
Learn what overconfidence bias is, see examples of its different forms, and find out how to avoid the overconfidence effect.
Overconfidence Bias: Flaws In Our Understanding
With all the heuristics and biases described below working against us, we construct satisfying stories about the world and vastly overestimate how much we understand about the past, present, and future. This leads to overconfidence bias.
The general principle behind overconfidence biases is this: we desire a coherent story of the world, because it comforts us in a world that may be largely random. If it's a good story, you believe it.
Insidiously, the fewer data points you receive, the more coherent the story you can form. You often don’t notice how little information you actually have and don’t wonder about what is missing. You focus on the data you have, and you don’t imagine all the events that failed to happen (the nonevents). You ignore your ignorance.
And even if you're aware of the types of overconfidence effects, you are nowhere near immune to them. Even when you're told that these biases exist, you often exempt yourself, believing you're smart enough to avoid them.
The ultimate test of an explanation is whether it can predict future events accurately. This is the guideline by which you should assess the merits of your beliefs.
Below, we'll look at some of the subtypes of overconfidence bias and the fallacies that contribute to overconfidence effects.
Narrative Fallacy
We want to package up a messy world into a clean-cut story. It is unsatisfying to believe that outcomes are driven largely by chance, partly because this makes the future unpredictable. But in a world of randomness, regular patterns are often illusions. The narrative fallacy leads to overconfidence bias because it makes us feel confident that we understand the events around us.
Here are a few examples of narrative fallacy:
- History is presented as an inevitable march of a sequence of events, rather than a chaotic mishmash of influences and people. If the past were so easily predictable in hindsight, then why is it so hard to predict the future?
- Management literature likewise tries to find patterns in management practices that predict success. Often, the results are disappointing and don't endure.
- The correlation between a firm's success and the quality of its CEO might be as high as .30, which is lower than most people would guess. Practically, a correlation of .30 means the stronger CEO leads the stronger firm in about 60% of pairs, just 10 percentage points better than chance (see the simulation sketch after this list).
- We readily trust our judgments in situations that are poor predictors of real performance (like job interviews).
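To see where that 60% figure comes from, here is a minimal Monte Carlo sketch (our illustration, not from the book). It draws pairs of firms whose CEO quality and success correlate at .30 and counts how often the firm with the stronger CEO also performs better:

```python
import numpy as np

# Sketch: if CEO quality and firm success correlate at r = 0.30, how often
# does the firm with the stronger CEO also end up as the stronger firm?
rng = np.random.default_rng(0)
r = 0.30
cov = [[1.0, r], [r, 1.0]]

# Each row is one firm: column 0 = CEO quality, column 1 = firm success.
firm_a = rng.multivariate_normal([0.0, 0.0], cov, size=1_000_000)
firm_b = rng.multivariate_normal([0.0, 0.0], cov, size=1_000_000)

# In each pair, check whether the firm with the better CEO also did better.
same_order = (firm_a[:, 0] > firm_b[:, 0]) == (firm_a[:, 1] > firm_b[:, 1])
print(f"Stronger CEO leads stronger firm: {same_order.mean():.1%}")
# Prints roughly 60%, versus 50% for pure chance.
```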
Even knowing the narrative fallacy and how it contributes to overconfidence bias, you might still be tempted to write a narrative that makes sense: for example, successful companies become complacent while underdogs try harder, and that's why reversion to the mean happens. Kahneman says this is the wrong way to think about it. The gap between high performers and low performers must shrink, because part of the outcome was due to luck. It's pure statistics, as the sketch below illustrates.
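Here is a small sketch of that statistical point (again our illustration, with made-up parameters): model each firm's annual result as stable skill plus freshly drawn luck, and the gap between the top and bottom performers shrinks in the next period with no narrative required.

```python
import numpy as np

# Sketch: outcome = skill + luck. Skill is stable; luck is redrawn each year,
# so extreme performers drift back toward the mean with no story required.
rng = np.random.default_rng(1)
n_firms = 10_000

skill = rng.normal(0, 1, n_firms)
year1 = skill + rng.normal(0, 1, n_firms)  # luck draw #1
year2 = skill + rng.normal(0, 1, n_firms)  # luck draw #2, independent of #1

top = year1 >= np.quantile(year1, 0.9)     # year-1 top 10% of firms
bottom = year1 <= np.quantile(year1, 0.1)  # year-1 bottom 10% of firms

gap1 = year1[top].mean() - year1[bottom].mean()
gap2 = year2[top].mean() - year2[bottom].mean()
print(f"Performance gap in year 1: {gap1:.2f}")
print(f"Same firms in year 2:      {gap2:.2f}  (the gap shrinks)")
```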
Antidotes to Narrative Fallacy
Be wary of highly consistent patterns that emerge from comparing more successful and less successful examples. There is a lot you don't know: whether the samples were cherry-picked, whether failed results were excluded from the dataset, and other experimental tricks. Don't succumb to the overconfidence bias.
Be wary of people who declare very high confidence around their explanation. This suggests they’ve constructed a coherent story, not necessarily that the story is true.
Hindsight Bias
Another facet of overconfidence bias is hindsight bias. Once we know the outcome, we connect the dots in the past that make the outcome seem inevitable and predictable.
Insidiously, you don’t remember how uncertain you were in the past—once the outcome is revealed, you believe your past self was much more certain than you actually were! It might even be difficult to believe you ever felt differently. In other words, “I knew it all along.” You rewrite the history of your mind. This is related to overconfidence bias.
- Imagine all the people who believe they foresaw the 2000 dotcom bubble bursting or the 2008 financial crisis happening.
- Professional forecasters (e.g., experts who appear on talk shows) perform no better than chance at predicting events. This may be partly because they're hired for their charisma and outspokenness, not their accuracy.
Hindsight bias is a problem because it inflates our confidence about predicting the future. If we are certain that our past selves were amazing predictors of the future, we believe our present selves to be no worse.
Outcome Bias
Related to hindsight bias, outcome bias is the tendency to judge the quality of a decision by its outcome rather than by the information available when it was made. People who succeeded are assumed to have made better decisions than people who failed. This feeds overconfidence bias.
This causes a problem where people are rewarded and punished based on outcome, not on their prior beliefs and their appropriate actions. People who made the right decision but failed are punished more than those who took irresponsible risks that happened to work out.
(Shortform note: to push the logic further, this causes problems for continued success. People who got lucky will be promoted but won't be able to replicate their success. In contrast, the people who made good decisions won't be promoted and won't be in a position to succeed in the future.)
An example of outcome bias:
- After September 11th, the US government was criticized for ignoring information about al-Qaeda received in July 2001. But this ignores how little anyone could have predicted, at the time, that the September 11th attacks would follow from that piece of information.
(Shortform note: antidotes to hindsight and outcome bias include:
- Keeping a journal of your current beliefs and what you estimate the outcomes to be. In the future, once the outcomes are known, reflect on your beliefs at the time to see how accurate you were (a scoring sketch follows this note).
- Rewarding people based on the decisions they make at the time with the information they had, before the outcomes come out. Don’t reward people who took outlandish risks but got lucky.)
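One way to make such a journal quantitative is to attach a probability to each belief and score it once the outcome is known. The sketch below uses the Brier score, a standard calibration measure that the note above doesn't prescribe, on hypothetical journal entries:

```python
# Sketch of a prediction journal scored with the Brier score
# (a standard calibration measure; the note above doesn't prescribe one).
# Each entry: (claim, probability you assigned, actual outcome).
journal = [
    ("Project ships by Q3",    0.90, False),
    ("Competitor cuts prices", 0.30, True),
    ("Key hire accepts offer", 0.70, True),
]

# Brier score: mean squared gap between stated probability and outcome.
# 0.0 is perfect; always saying 50% scores 0.25; confident misses score worst.
brier = sum((p - float(hit)) ** 2 for _, p, hit in journal) / len(journal)
print(f"Brier score: {brier:.3f}")
```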
Dangers of Overconfidence Biases
Even when presented with data of your poor predictions, you do not tend to adjust your confidence in your predictions. You forge on ahead, confident as always, discarding the news. The overconfidence bias is strong.
Daniel Kahneman argues that the entire stock-picking industry is built on an illusion of skill and overconfidence bias. People know that, on average, investors do not beat market returns (by definition this must be the case, since the market return is the average of all traders in it). And plenty of studies show that retail investors trade poorly, against best practices: they sell rising stocks to lock in gains and hang on to their losers out of hope, the exact opposite of what they should do. In turn, large professional investors are happy to take advantage of these mistakes. But retail traders march on, believing they have more skill than they really do.
Here are many of the reasons it's so difficult to believe that randomness is the primary factor in your outcomes, that your skill is worse than you think, and that you're vulnerable to overconfidence bias:
- Pride and ego are at stake.
- The more famous the forecaster, the more overconfident and flamboyant the predictions.
- Experts resist admitting they’re wrong, instead giving excuses: they were wrong only in timing; they would have been right, but an unforeseeable event had intervened; or they were wrong, but for the right reasons.
- You take deliberate, skillful steps to guide the outcome. Because you're producing so much motion, you feel you can't be wrong.
- Stock analysts pore over financial statements and build models. This requires lots of training and makes stock picking seem rigorous. But it doesn't answer the real, more difficult question: is the information already priced into the stock?
- Managers focus on the strength of their strategy and how good their company seems, discounting what their competitors are doing and market changes (“competition neglect”).
- Your experience shows many instances where your predictions came true, partly because those instances are the most available to you, while you discount your mistakes.
- You focus on the causal role of skill and neglect the role of luck: the illusion of control.
- You don’t know what you don’t know. You aren’t aware of most of the factors that influence the final outcome, focusing on only the patterns that you do see.
- Large monetary incentives are at stake. You are being paid for your skill; if your skill turns out to be irrelevant, you'll lose your job.
- A survey of CFOs asked them to give an "80% confidence interval" for stock market returns. When you ask someone for an 80% confidence interval, you're asking for a range bounded above by a value the person is 90% sure is too high to be correct and below by a value the person is 90% sure is too low to be correct. Any result outside this range counts as a "surprise," so an accurate 80% confidence interval would be surprised only 20% of the time. The CFOs' guesses were far too narrow: the survey showed surprises 67% of the time. The accurate 80% confidence interval that year ran from about -10% to +30% returns, but any CFO who gave that range would be criticized as lacking any knowledge (see the calibration sketch after this list).
- Being unsure is seen as a sign of weakness. It can have material consequences with customers and investors, or with staff who want more certainty.
- Strong social proof in your community can maintain a belief in skill—if all these other smart people believe skill influences the outcome, then they can’t be wrong.
- Emergencies call for decisive action. In stressful situations, people crave certainty even more, and hesitant decision-making is criticized.
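To make the calibration check from the CFO survey concrete, here is a minimal sketch with made-up intervals and returns: count how often the realized value lands outside each stated 80% interval. A well-calibrated forecaster should land near a 20% surprise rate.

```python
# Sketch of the calibration check behind the CFO survey (made-up numbers):
# each row is (low, high) for a stated 80% confidence interval on a return,
# paired with the return that actually happened.
intervals = [(-0.02, 0.08), (0.00, 0.06), (-0.01, 0.05), (0.02, 0.07)]
actuals = [0.12, -0.05, 0.03, 0.15]

# A "surprise" is any realized value outside the stated interval.
surprises = sum(
    not (low <= actual <= high)
    for (low, high), actual in zip(intervals, actuals)
)
rate = surprises / len(actuals)
print(f"Surprise rate: {rate:.0%} (a calibrated 80% interval -> about 20%)")
# With intervals this narrow, the rate lands far above 20%,
# mirroring the 67% found in the survey.
```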
Understanding overconfidence bias is the first step toward overcoming it. But it’s not easy.
———End of Preview———
Like what you just read? Read the rest of the world's best summary of "Thinking, Fast and Slow" at Shortform. Learn the book's critical concepts in 20 minutes or less.
Here's what you'll find in our full Thinking, Fast and Slow summary :
- Why we get easily fooled when we're stressed and preoccupied
- Why we tend to overestimate the likelihood of good things happening (like the lottery)
- How to protect yourself from making bad decisions and from scam artists