This article is an excerpt from the Shortform book guide to "The Art of Thinking Clearly" by Rolf Dobelli. Shortform has the world's best summaries and analyses of books you should be reading.
What is Rolf Dobelli’s The Art of Thinking Clearly about? What logical fallacies does Dobelli explore in the book?
The Art of Thinking Clearly by Rolf Dobelli is an introduction to the most common logical biases and thinking errors. Logical fallacies affect everyone and are extremely difficult to avoid. Rolf Dobelli encourages readers to improve their decisions by learning how to recognize the fallacies and how to work around them.
Below is a brief overview of the key themes.
You Want to Belong to the Group
In The Art of Thinking Clearly, Rolf Dobelli breaks down the most common logical fallacies that inhibit decision-making, including confirmation bias, social proof, and hindsight bias. He begins his discussion with fallacies that stem from group membership. He says that one of the traits that most influences you is the desire to be in a group. For early humans, group membership was necessary for survival. Those who left the group died, while those who stayed with the group survived and reproduced. Thus, your brain is genetically wired to fit in.
(Shortform note: The physical protection groups once offered has become less important, but group membership is still valuable to modern humans: It exposes you to others’ experiences and life skills and helps you develop empathy and self-worth.)
Social Proof and Authority Bias
To maintain your place in a group, you’re pressured to copy other people’s behavior, especially if that person is an authority. This pressure can even convince you to ignore your morals, Dobelli says.
(Shortform note: It’s hard to resist pressure from authorities because you’re trained from childhood to obey them. You can make this pressure easier to resist by distancing yourself from the authority and forming a connection with any victims of the problematic actions you’re being pressured into.)
In-Group, Out-Group Bias
Another consequence of group membership is prioritizing your own group above others. Your brain focuses on similarities between you and your group members, Dobelli says, ignoring any differences. Your brain also simplifies other groups, labeling them as “other” and ignoring any similarities you share with them. Finally, you think your group is the best because you only spend time with your group and don’t hear any different opinions.
(Shortform note: You prioritize your in-group because group membership becomes part of your identity: Protecting your group becomes protecting your identity. However, pinning your identity on membership of an in-group is dangerous, because in-groups can change depending on the situation—you may find yourself no longer a member of the group you value and feel unsure of your identity. You can avert this by increasing the variety of your groups so you don’t tie yourself too closely to just one. Spend time with people from different groups, focus on points of connection with others rather than differences, and recognize when your groups are formed around arbitrary means that could crumble.)
You Pay Attention to the Wrong Things
Now that we’ve covered biases relating to humanity’s desire to fit in, we’ll examine fallacies that misdirect your attention. Humans tend to pay attention to the most memorable or flashy information that comes up, rather than the most pertinent or helpful, Dobelli explains. (Shortform note: The brain has arguably evolved to do this for reasons of efficiency. Your brain rapidly stores information, and it takes less time and energy to accept the flashiest information available than to evaluate the entire situation.)
The Salience Effect
According to Dobelli, when the salience effect takes hold, your brain latches onto unusual or notable factors of a situation and gives them too much credit for causing the situation, ignoring any more subtle influences. (Shortform note: What you pay attention to when the effect sets in depends on your past experiences. You might notice certain details due to your career or past experiences while someone else would notice different things. Thus, surrounding yourself with people with varied experiences can give you a clearer idea of the whole situation, as you can compare notes on the different things you’ve noticed.)
Story Bias
Another way people’s attention is misdirected is through story bias. People prefer entertaining fiction to boring facts, Dobelli explains. Sometimes, this means following an interesting story-based tangent while ignoring the central issue; other times, it means assigning meaning to random events.
(Shortform note: Why do we do this? Possibly because stories activate your brain’s sensory processing center. This gives you a shot of positive feelings like compassion and empathy.)
How can you overcome story bias? Dobelli suggests picking stories apart rather than blindly consuming and accepting them: Consider the storyteller’s intentions and what they might be hiding.
(Shortform note: In addition, be cautious about what kind of stories you consume. The stronger and more convenient the narrative, the more suspicious you should be of it, because strong narratives can distract from logical gaps in a person’s statement.)
Survivorship Bias
A final bias that Dobelli claims unproductively diverts your attention is survivorship bias: the belief that you have a better chance of succeeding than you actually do because success stories are more widely publicized than failures. Dobelli suggests researching statistics and examples of failure in the field you’re considering to gain a more accurate perspective of success. (Shortform note: Start your research by asking “What am I missing?” and looking for holes in your data. In addition, be careful when researching: If your sources are incomplete or suffering from survivorship bias themselves, they won’t help you ascertain reality.)
You’re Using the Wrong Kind of Thinking
The next set of fallacies we’ll look at revolve around the kind of thinking you use. Dobelli says there are two main kinds of thinking: Fast and instinctive, and slow and logical. (Shortform note: These types of thinking are also called hot and cold cognition. The first is influenced by instinct and emotions, while the second is based on logic and reasoning.)
Fast, instinctive thinking is useful when actions are familiar or something you’ve been evolutionarily optimized for, Dobelli says. (Shortform note: Malcolm Gladwell argues in Blink that you can use your instinctive thinking more often—for instance, even in unfamiliar situations—as long as you train it correctly to avoid coming to incorrect snap judgments. You can improve the accuracy of your instinctive thinking by expanding your worldview, so that your instincts have more experience to draw on, as well as by paying attention to context, so that you know what type of experience to draw on.)
When you’re trying something new, or you’re in a complex situation you’re not instinctually prepared for, use slow logic, Dobelli adds. (Shortform note: Trigger logical thinking by increasing your task’s difficulty. For example, making the font smaller when analyzing a news article increases the effort you put into the task and triggers logical thinking.)
The Conjunction Fallacy
One incorrect use of cognition is the tendency to prefer a plausible story to a probable one, Dobelli explains. In other words, when a story makes sense to you, you’re likely to believe it even if the true probability of it occurring is low. For example, consider a girl named Katrina who loves musicals. Now consider these statements:
- Katrina performs on stage.
- Katrina performs on stage in a musical.
Many people pick the second option as most likely because it makes a better story: Katrina loves musicals, so she’d perform in a musical. However, the first option is actually more likely in terms of probability because it’s broader: It has just one condition (Katrina being on stage) rather than two (Katrina being on stage and in a musical).
This tendency, called the conjunction fallacy, occurs when you use fast thinking instead of slow. While your logical brain is still calculating the probability of an event, your instinctive thought process makes connections to explain why the event might occur. The connections it finds often form a plausible story, so you accept the instinctive connection rather than waiting for the logical probability.
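The probability rule underlying the fallacy is easy to verify directly: a conjunction of two conditions can never be more likely than either condition alone. Here’s a minimal sketch, with probabilities invented purely for illustration (they aren’t from the book):

```python
# Hypothetical, illustrative numbers: suppose 10% of people ever perform on
# stage, and half of those stage performers appear in musicals.
p_stage = 0.10
p_musical_given_stage = 0.50

# "On stage AND in a musical" can never beat "on stage" alone, because
# P(A and B) = P(A) * P(B given A), and P(B given A) is at most 1.
p_stage_and_musical = p_stage * p_musical_given_stage

print(p_stage)              # 0.1
print(p_stage_and_musical)  # 0.05
```

However plausible the story, the second statement’s probability is at best equal to the first’s, never greater.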
(Shortform note: While Dobelli presents the conjunction fallacy as something instinctual and common, others argue that he exaggerates the danger of said fallacy. Studies show that the conjunction fallacy isn’t as widespread as the original researchers, whose research Dobelli relies on, suggested. Only 45% of respondents succumbed to the fallacy, compared to 85% in the original study. In addition, the new researchers discovered that the conjunction fallacy can be avoided by simply discussing a situation with another person.)
The Affect Heuristic
Part of humanity’s fast, instinctive thinking is the affect heuristic: a mental shortcut in which your brain makes rapid subconscious judgments of like or dislike. These “affects” influence your risk-benefit analyses: If your immediate judgment is good, you’ll focus on the benefits of a situation, while if the affect is bad, you’ll focus on the risks.
(Shortform note: These instinctive judgments can help you make decisions. This was particularly helpful to early humans because it quickly provided more data for risk analysis, and early humans faced regular life-threatening risks. However, the information required for a good risk-benefit analysis in modern times is too complex for the affect heuristic to assess adequately, especially when emotions are involved. Thus, rather than being beneficial, the heuristic can lead you to make a risky decision merely because you’re excited or delay a good decision because you’re sad.)
You Struggle to Understand Complex Math
The next set of fallacies we’ll look at revolve around complex math. Your hunter-gatherer brain isn’t designed for complex math, Dobelli says. This means you can’t instinctively grasp complex math concepts, but understanding these concepts is increasingly important for modern life.
(Shortform note: Some people argue that people’s difficulty with complex math concepts is a result of how math is viewed. People internalize the idea that math is difficult, and those who struggle with the concepts stop trying to understand them. If math were treated like a language, which takes practice but can be learned by anyone, people would learn complex math more easily.)
Here are some situations in which struggling with math negatively impacts your decisions:
The Distribution of Averages
Averages are one of the complex math concepts that your brain isn’t evolutionarily prepared for, Dobelli explains. One of the biggest pitfalls when working with averages is ignoring the distribution: the original set of numbers used to calculate the average. Without knowing the distribution, averages are misleading because they don’t show the outliers: the extremes at either end of the distribution that drastically change the average. To get a true average, these outliers must be removed, Dobelli says. This isn’t instinctive, but it’s important for modern life because outliers are increasingly common.
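A quick sketch shows how a single outlier distorts an average; the numbers below are made up for illustration:

```python
# Toy data: monthly incomes (in thousands of dollars) for ten people,
# where the final very high earner is the outlier.
incomes = [3, 3, 4, 4, 5, 5, 5, 6, 6, 100]

mean_all = sum(incomes) / len(incomes)  # 14.1 -- dragged upward by the outlier
without_outlier = incomes[:-1]
mean_trimmed = sum(without_outlier) / len(without_outlier)  # about 4.6

print(mean_all)
print(round(mean_trimmed, 1))
```

The raw average suggests a typical income of 14.1, which describes nobody in the list; dropping the outlier yields a figure much closer to what most people in the distribution actually earn.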
Self-Selection Bias
Statistics is another area of math that you’re not evolutionarily primed for, Dobelli says. One statistical error is self-selection bias, in which the nature of the participants in a study influences its outcome. Specifically, people only join studies they’re comfortable responding to, which alters your data, Dobelli says. Those who might provide embarrassing or somehow “undesirable” responses simply won’t take part, narrowing your study’s scope and skewing the results.
(Shortform note: The only way to completely eliminate these problems is studying people who don’t know they’re being studied. However, researchers must receive consent from participants, so this isn’t practical. That said, you can limit self-selection bias. Most studies do this by collecting demographic information: Researchers look for patterns in the demographics of people who chose to participate and alter how they weigh the responses to reduce self-selection bias. Specifically, they give greater weight to results from those less likely to self-select, and lesser weight to results from those likely to self-select.)
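The reweighting idea can be sketched in a few lines; the groups, scores, and participation rates below are invented for illustration:

```python
# Hypothetical survey: group "A" participates at a 50% rate, group "B" at
# only 25%, so B's voices are underrepresented in the raw responses.
responses = [("A", 7), ("A", 8), ("A", 7), ("A", 8), ("B", 3)]
participation = {"A": 0.50, "B": 0.25}  # assumed known from demographic data

# Weight each response by the inverse of its group's participation rate,
# so responses from groups less likely to self-select count for more.
weighted_sum = sum(score / participation[group] for group, score in responses)
total_weight = sum(1 / participation[group] for group, _ in responses)
weighted_mean = weighted_sum / total_weight

unweighted_mean = sum(score for _, score in responses) / len(responses)
print(unweighted_mean)  # 6.6
print(weighted_mean)    # 6.0
```

The lone response from the reluctant group pulls the weighted mean further down than the raw mean, approximating what a fully representative sample might have shown.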
Your Memory Is Not as Reliable as You Think It Is
Next, we’ll cover fallacies related to memory. People believe their memories are untouchable, stored away and recalled when needed in perfect condition. However, this isn’t the case, Dobelli warns. Your memory is affected by your feelings, opinions, and situation.
(Shortform note: Your memories are affected in these ways at several points: First, whatever you were feeling in the moment is tangled up with the actual situation in your memory; later, every time you remember the situation, your current mental state further alters your memories. Thus, the more you remember a situation, the more distorted the memory becomes.)
Falsification of History
The main reason your memory is unreliable is that your brain is constantly rewriting your memories, Dobelli explains. This is called falsification of history. As your opinions and worldview change over time, your brain alters the details of your memories, making you remember the past in a way that better matches your current opinions and worldview.
(Shortform note: Your brain rewrites memories in this way to be helpful: By updating the information, your memories become more relevant to the current moment and your current decisions. However, rewriting memories also means you become overconfident in your beliefs: When you think you’ve always held the same beliefs, you won’t feel the need to challenge them.)
The Primacy and Recency Effects
Your memory is also influenced by the order in which you receive information and how much time has passed since you received said information. According to Dobelli, the first information you receive is initially easier to remember than information introduced later. This is called the primacy effect. However, this only works for a short time, as the information eventually leaves your short-term memory. After that, whatever information you heard most recently is easier to remember. This is the recency effect.
(Shortform note: How do these effects work? In the case of the primacy effect, when you learn a piece of information early, your brain has more time to repeat it. This keeps it in your short-term memory for longer, until it can be transferred to long-term memory. As for the recency effect, when you learned a piece of information recently, the information is still in your short-term memory and so is easy to recall. You can manipulate these tendencies by memorizing important information first to trigger the primacy effect and reviewing information before you need it to trigger the recency effect.)
You Misinterpret Cause and Effect
In this section, we’ll cover how misinterpreting cause and effect damages your judgment. According to Dobelli, humans struggle to interpret cause and effect because they confuse correlation and causation. When two events coincide, people assume there’s a causal relationship between the two of them, even when there’s not.
(Shortform note: How do people make these mistaken links? They take their knowledge of the effect and look for any similar events that might point to a cause, regardless of the likelihood of that similar event actually being the cause. In other words, they look for possible correlations between the events and mistake this for one causing the other.)
Association Bias
One type of misrepresentation of cause and effect is association bias, or the brain’s tendency to make connections where none exist. Dobelli says this misrepresents cause and effect by forming false knowledge, where you falsely causally connect two unrelated things.
Superstitions form this way, Dobelli explains. For example, say you bring rainboots when camping, and the weather is perfect. The next time you go camping, you leave the rainboots behind and the weather is awful. The next time you bring them, the weather is wonderful again. After a few of these experiences, your brain connects the boots and good weather, even though it’s just a coincidence that the weather improved when you brought the boots.
(Shortform note: Why does association bias occur? Dobelli doesn’t say, but others argue that association bias is a defense mechanism: Making connections helps you form “protective frames.” These are practices or support systems that let you evaluate risk (for example, the risk of it raining when you go camping). In our example, the brain mistakenly created a frame in which the presence of rainboots reduces the risk of rain.)
The Fallacy of the Single Cause
Another way people misrepresent cause and effect is by oversimplifying: To make a simple pattern of cause and effect, people simplify to a single cause. This mindset is dangerous because everything is affected by a complex web of causes, Dobelli states. There’s never a single cause for complex effects like crime or success. (Shortform note: Problematically, if you simplify to a single cause, you’ll also simplify to a single solution. For example, if you believe high illness rates are solely due to unaffordable healthcare, you’ll work only to make healthcare affordable. Your singular focus means you don’t realize that other factors like safe housing and income must be addressed too.)
You Struggle to Understand Probability and Predictions
The next set of fallacies we’ll cover revolves around probability and predictions. Dobelli says people hate uncertainty and try to predict future events to alleviate that uncertainty. However, to make accurate predictions, you must understand probability, which humans struggle with. Thus, people’s predictions are usually inaccurate.
(Shortform note: Even though humans are proven to struggle with probability, and predictions are notoriously unreliable, people still make a living estimating probability and making predictions. This is a form of authority bias: You assume that if the person is making a prediction, they must have based said prediction on experience. However, no matter how knowledgeable the person is, they’ll struggle to process the information needed to make accurate predictions.)
Neglect of Probability
According to Dobelli, people struggle to make good decisions because they neglect to consider the probability or risk involved in those decisions. Logically, they should choose the option with the highest probability of going well for them and the lowest risk of going badly. However, people instead choose the option that will have the biggest positive impact on them if it occurs, regardless of how likely it is to occur.
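One way to make this concrete is to compare options by expected value, payoff times probability; the figures below are invented for illustration:

```python
# Hypothetical choice: a lottery-style option with a huge payoff but tiny
# odds, versus a modest option with even odds.
big_payoff, big_prob = 1_000_000, 1 / 1_000_000
modest_payoff, modest_prob = 100, 0.5

# Expected value = payoff * probability of receiving it.
ev_big = big_payoff * big_prob           # about 1
ev_modest = modest_payoff * modest_prob  # 50.0

# Neglect of probability draws people to the big payoff, even though the
# modest option is worth far more on average.
print(ev_modest > ev_big)  # True
```

The big payoff dominates the imagination, but weighting each outcome by its likelihood shows the modest option is the better bet by a wide margin.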
(Shortform note: A type of neglect that Dobelli doesn’t cover is denominator neglect, which Daniel Kahneman describes in Thinking, Fast and Slow. If probability is a fraction, with the situation you’re evaluating as the numerator and the total number of possibilities as the denominator, you’ll base your judgment of risk solely on the numerator and ignore the denominator. This means you’ll regularly misinterpret probability, since the numerator depends on the denominator to accurately show probability.)
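Denominator neglect can be illustrated with a classic-style marble example (the counts here are chosen for illustration): an urn holding 1 red marble out of 10 offers better odds than one holding 8 red marbles out of 100, even though the second urn has “more” red marbles:

```python
# Two hypothetical urns: focusing on the numerators (1 vs. 8 red marbles)
# makes urn B look better; dividing by the denominators says otherwise.
p_urn_a = 1 / 10    # 0.1
p_urn_b = 8 / 100   # 0.08

print(p_urn_a > p_urn_b)  # True
```

Judging by the numerator alone favors urn B; only the full fraction reveals the true probability.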
Hindsight Bias
The next fallacy we’ll cover is hindsight bias. Dobelli says hindsight bias makes past events seem like they should’ve been easily predictable. People see an obvious pattern of circumstances that led to a past event occurring, and they think people should have noticed that pattern and predicted the event. At the time, though, the pattern wasn’t clear, so people couldn’t use it to predict the event. It’s only with hindsight that the pattern becomes clear.
Hindsight bias encourages overconfidence, Dobelli says. You think you’re good at detecting patterns when really, you’re not: You’re only seeing them because of hindsight. You thus fail when trying to apply these pattern-spotting “skills” to predicting the future.
(Shortform note: Past events seem obvious because of how your brain predicts: When shown two possibilities, your brain creates reasons why both are possible. However, once Possibility A is proven, your brain doesn’t need to retain information about Possibility B. It forgets that information, making you believe Possibility A was obvious all along. This altered memory also creates overconfidence. You forget any prior uncertainty or incorrect predictions, which reinforces your overconfidence about your pattern-finding and prediction abilities.)
You Value Things for Arbitrary Reasons
In this final section, we’ll cover fallacies that affect how you value things. According to Dobelli, humans tend to put value in a person, situation, or item for arbitrary and illogical reasons.
The Endowment Effect
One illogical shift in your valuation of an item is that when you own an item, you subconsciously overinflate its value simply because it’s yours, Dobelli explains. This is called the endowment effect.
(Shortform note: This effect stems from loss aversion. Once something is in your possession, you fear losing it, which makes you value the item more. You can avoid this fallacy by avoiding personal connections to items. However, doing so may harm your well-being: Valued belongings become an extension of your identity and a way to express your personality, and preventing those connections from forming can make you feel stifled and unable to be yourself.)
Liking Bias
Liking bias also affects how you value people, specifically. The more you like someone, the more value you put on their opinions and desires, Dobelli says. This means you’re more likely to do something for an individual you like, even if doing so goes against your own interests.
(Shortform note: Dobelli doesn’t say why liking someone makes you value them more. Some experts say you value people you like more because when you like someone, you form an alliance with them. Having a common goal (friendship) unites you and the other person, making you more likely to value them and fulfill their desires.)
The Sunk Cost Fallacy
Another error in thinking that affects how you value things is the sunk cost fallacy. According to Dobelli, the more time, effort, or resources you invest in something, the higher you value that thing. You’ll also be more resistant to parting with it, even if keeping it means losing more time, effort, or resources in the future.
(Shortform note: This fallacy stems from a fear of waste: Most people try not to waste time, money, or effort, and letting go of something you’ve invested resources in feels like wasting those resources. While this is technically true—it is a waste of time, money, or effort—continuing to invest resources only creates more waste.)
You can overcome this fallacy by focusing on whether something is serving you in the present and will continue to do so in the future, rather than focusing on what you’ve invested in the past. (Shortform note: Dobelli’s suggestion to focus on the future doesn’t mean ignoring the past: Consider the past to make good decisions based on all the data you’ve collected, but don’t let past effort stop you from moving on.)
———End of Preview———
Like what you just read? Read the rest of the world's best book summary and analysis of Rolf Dobelli's "The Art of Thinking Clearly" at Shortform.
Here's what you'll find in our full The Art of Thinking Clearly summary :
- A detailed look at the most common logical fallacies that inhibit decision-making
- How to recognize and overcome these fallacies to make better decisions
- Why you value things for arbitrary reasons