PDF Summary: The Art of Thinking Clearly, by Rolf Dobelli

Book Summary: Learn the key points in minutes.

Below is a preview of the Shortform book summary of The Art of Thinking Clearly by Rolf Dobelli. Read the full comprehensive summary at Shortform.

1-Page PDF Summary of The Art of Thinking Clearly

In The Art of Thinking Clearly, Rolf Dobelli breaks down the most common logical fallacies that inhibit decision-making, including confirmation bias, social proof, and hindsight bias. Dobelli aims to help people recognize and overcome these fallacies so they can make better decisions.

In this guide, we’ll explore Dobelli’s main fallacies, including those caused by humanity’s past as hunter-gatherers and those caused by other sources such as misinterpretation of cause and effect. Along the way, we’ll compare and contrast Dobelli’s ideas with other experts on logical thinking, such as Nassim Nicholas Taleb and Daniel Kahneman. We’ll also provide concrete steps to overcome illogical thinking and explore why logical fallacies occur.

(continued)...

(Shortform note: Some people argue that people’s difficulty with complex math concepts is a result of how math is viewed. People internalize the idea that math is difficult, and those who struggle with the concepts stop trying to understand them. If math were treated like a language, which takes practice but can be learned by anyone, people would learn complex math more easily.)

Here are some situations in which struggling with math negatively impacts your decisions:

The Distribution of Averages

Averages are one of the complex math concepts that your brain isn't evolutionarily prepared for, Dobelli explains. One of the biggest pitfalls when working with averages is ignoring the distribution: the original set of numbers used to calculate the average. Without knowing the distribution, averages are misleading because they don’t show the outliers: the extremes at either end of the distribution that drastically change the average. To get a true average, these outliers must be removed, Dobelli says. This isn’t instinctive, but it’s important for modern life because outliers are increasingly common.

Averages and Scalable Events

Dobelli’s discussion of distribution and outliers finds parallels in Nassim Nicholas Taleb’s The Black Swan. Taleb sorts events into two categories: scalable and non-scalable. Scalable events have no defined limits, while non-scalable events have defined limits. (Taleb notes that Black Swan events—events that are unpredictable yet highly influential—occur solely in scalable situations.)

Most natural events are non-scalable. For example, there’s a defined limit to how much weight a human can lift; strength and weakness revolve around an average, and there are few outliers in that average’s distribution.

On the other hand, many man-made situations and ideas are scalable, with outliers in their averages’ distribution. There’s no upper limit to wealth, for example, which allows for the existence of billionaires—outliers in the distribution of global wealth. The presence of a single billionaire can significantly raise the average wealth of a town, making this average misleading—most people will earn well below the skewed average. Thus, when dealing with scalable situations, understanding averages and their distribution is important to gain an accurate picture of the situation.
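The skew Dobelli describes is easy to verify numerically. Here is a minimal Python sketch with invented figures: a town of 99 residents earning about $50,000, plus one billionaire.

```python
import statistics

# Invented figures: 99 residents earning $50,000 and one billionaire.
incomes = [50_000] * 99 + [1_000_000_000]

mean_income = statistics.mean(incomes)      # dragged up by the outlier
median_income = statistics.median(incomes)  # robust to the outlier

print(f"Mean:   ${mean_income:,.0f}")    # $10,049,500
print(f"Median: ${median_income:,.0f}")  # $50,000
```

Reporting the median alongside the mean (or setting the outlier aside, as Dobelli suggests) gives a far better picture of what a typical resident earns.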

Self-Selection Bias

Statistics is another area of math that you’re not evolutionarily primed for, Dobelli says. One statistical error is self-selection bias, in which the makeup of a study’s participants influences its outcome. Specifically, people join only the studies they’re comfortable responding to, which distorts your data, Dobelli says. Those who might provide embarrassing or somehow “undesirable” responses simply won’t take part, narrowing your study’s scope and skewing the results.

(Shortform note: The only way to completely eliminate these problems is studying people who don’t know they’re being studied. However, researchers must receive consent from participants, so this isn’t practical. That said, you can limit self-selection bias. Most studies do this by collecting demographic information: Researchers look for patterns in the demographics of people who chose to participate and alter how they weigh the responses to reduce self-selection bias. Specifically, they give greater weight to results from those less likely to self-select, and lesser weight to results from those likely to self-select.)
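The reweighting described above can be sketched in a few lines of Python. All numbers here are invented: suppose groups A and B are each half the population, but group B self-selects into the survey far less often.

```python
# Invented figures: population shares vs. the shares that actually responded.
population_share = {"A": 0.5, "B": 0.5}
sample_share = {"A": 0.8, "B": 0.2}  # group B under-participates

# Weight each group by how under- or over-represented it is.
weights = {g: population_share[g] / sample_share[g] for g in population_share}

# Hypothetical mean response per group (say, a 0-10 satisfaction score).
group_means = {"A": 8.0, "B": 4.0}

unweighted = sum(sample_share[g] * group_means[g] for g in group_means)
weighted = sum(sample_share[g] * weights[g] * group_means[g] for g in group_means)

print(round(unweighted, 2), round(weighted, 2))  # 7.2 6.0
```

The unweighted average over-represents group A; weighting restores each group to its true population share.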

Your Memory Is Not as Reliable as You Think It Is

Next, we’ll cover fallacies related to memory. People believe their memories are untouchable, stored away and recalled when needed in perfect condition. However, this isn't the case, Dobelli warns. Your memory is affected by your feelings, opinions, and situation.

(Shortform note: Your memories are affected in these ways at several points: First, whatever you were feeling in the moment is tangled up with the actual situation in your memory; later, every time you remember the situation, your current mental state further alters your memories. Thus, the more you remember a situation, the more distorted the memory becomes.)

In this section, we’ll look at some of the ways in which your memory is unreliable.

Falsification of History

The main reason your memory is unreliable is that your brain is constantly rewriting your memories, Dobelli explains. This is called falsification of history. As your opinions and worldview change over time, your brain alters the details of your memories, making you remember the past in a way that better matches your current opinions and worldview.

(Shortform note: Your brain rewrites memories in this way to be helpful: By updating the information, your memories become more relevant to the current moment and your current decisions. However, rewriting memories also means you become overconfident in your beliefs: When you think you’ve always held the same beliefs, you won’t feel the need to challenge them.)

The Primacy and Recency Effects

Your memory is also influenced by the order in which you receive information and how much time has passed since you received said information. According to Dobelli, the first information you receive is initially easier to remember than information introduced later. This is called the primacy effect. However, this only works for a short time, as the information eventually leaves your short-term memory. After that, whatever information you heard most recently is easier to remember. This is the recency effect.

(Shortform note: How do these effects work? In the case of the primacy effect, when you learn a piece of information early, your brain has more time to repeat it. This keeps it in your short-term memory for longer, until it can be transferred to long-term memory. As for the recency effect, when you learned a piece of information recently, the information is still in your short-term memory and so is easy to recall. You can manipulate these tendencies by memorizing important information first to trigger the primacy effect and reviewing information before you need it to trigger the recency effect.)

You Misinterpret Cause and Effect

In this section, we’ll cover how misinterpreting cause and effect damages your judgment. According to Dobelli, humans struggle to interpret cause and effect because they confuse correlation and causation. When two events coincide, people assume there’s a causal relationship between the two of them, even when there’s not.

(Shortform note: How do people make these mistaken links? They take their knowledge of the effect and look for any similar events that might point to a cause, regardless of the likelihood of that similar event actually being the cause. In other words, they look for possible correlations between the events and mistake this for one causing the other.)

Association Bias

One type of misrepresentation of cause and effect is association bias, or the brain’s tendency to make connections where none exist. Dobelli says this misrepresents cause and effect by forming false knowledge, in which you mistakenly draw a causal link between two unrelated things.

Superstitions form this way, Dobelli explains. For example, say you bring rainboots when camping, and the weather is perfect. The next time you go camping, you leave the rainboots behind and the weather is awful. The next time you bring them, the weather is wonderful again. After a few of these experiences, your brain connects the boots and good weather, even though it’s just a coincidence that the weather improved when you brought the boots.

(Shortform note: Why does association bias occur? Dobelli doesn’t say, but others argue that association bias is a defense mechanism: Making connections helps you form “protective frames.” These are practices or support systems that let you evaluate risk (for example, the risk of it raining when you go camping). In our example, the brain mistakenly created a frame in which the presence of rainboots reduces the risk of rain.)

The Fallacy of the Single Cause

Another way people misrepresent cause and effect is by oversimplifying: To make a simple pattern of cause and effect, people simplify to a single cause. This mindset is dangerous because everything is affected by a complex web of causes, Dobelli states. There’s never a single cause for complex effects like crime or success. (Shortform note: Problematically, if you simplify to a single cause, you’ll also simplify to a single solution. For example, if you believe high illness rates are solely due to unaffordable healthcare, you’ll work only to make healthcare affordable. Your singular focus means you don’t realize that other factors like safe housing and income must be addressed too.)

You Struggle to Understand Probability and Predictions

The next set of fallacies we’ll cover revolves around probability and predictions. Dobelli says people hate uncertainty and try to predict future events to alleviate that uncertainty. However, to make accurate predictions, you must understand probability, which humans struggle with. Thus, people’s predictions are usually inaccurate.

(Shortform note: Even though humans are proven to struggle with probability, and predictions are notoriously unreliable, people still make a living estimating probability and making predictions. This is a form of authority bias: You assume that if the person is making a prediction, they must have based said prediction on experience. However, no matter how knowledgeable the person is, they’ll struggle to process the information needed to make accurate predictions.)

In this section, we’ll cover ways people misunderstand probability and make inaccurate predictions.

Neglect of Probability

According to Dobelli, people struggle to make good decisions because they neglect to consider the probability or risk involved in those decisions. Logically, they should choose the option with the highest probability of going well for them and the lowest risk of going badly. However, people instead choose the option that will have the biggest positive impact on them if it occurs, regardless of how likely it is to occur.
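The logic Dobelli describes amounts to comparing expected values rather than raw payoffs. A quick sketch with invented numbers:

```python
# Invented options: a long shot with a large payoff vs. a likely modest gain.
def expected_value(probability: float, payoff: float) -> float:
    """Average outcome if you could repeat the gamble many times."""
    return probability * payoff

long_shot = expected_value(0.05, 10_000)  # 5% chance of $10,000
safe_bet = expected_value(0.90, 1_000)    # 90% chance of $1,000

# Neglect of probability favors the $10,000 headline number, but the
# safe bet is worth more on average ($900 vs. $500).
print(safe_bet > long_shot)  # True
```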

(Shortform note: A type of neglect that Dobelli doesn’t cover is denominator neglect, which Daniel Kahneman describes in Thinking, Fast and Slow. If probability is a fraction, with the situation you’re evaluating as the numerator and the total number of possibilities as the denominator, you’ll base your judgment of risk solely on the numerator and ignore the denominator. This means you’ll regularly misinterpret probability, since the numerator depends on the denominator to accurately show probability.)
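A toy example (numbers invented) makes the fraction point concrete: a jar with 8 winning marbles out of 100 feels more promising than a jar with 1 winner out of 10, because 8 is bigger than 1, yet the second jar is the better bet.

```python
# Denominator neglect: judging by numerators alone picks the wrong jar.
jar1_winners, jar1_total = 8, 100
jar2_winners, jar2_total = 1, 10

p_jar1 = jar1_winners / jar1_total  # 0.08
p_jar2 = jar2_winners / jar2_total  # 0.10

print(p_jar2 > p_jar1)  # True: the smaller numerator wins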

Hindsight Bias

The next fallacy we’ll cover is hindsight bias. Dobelli says hindsight bias makes past events seem like they should’ve been easily predictable. People see an obvious pattern of circumstances that led to a past event occurring, and they think people should have noticed that pattern and predicted the event. At the time, though, the pattern wasn’t clear, so people couldn’t use it to predict the event. It’s only with hindsight that the pattern becomes clear.

Hindsight bias encourages overconfidence, Dobelli says. You think you’re good at detecting patterns when really, you’re not: You’re only seeing them because of hindsight. You thus fail when trying to apply these pattern-spotting “skills” to predicting the future.

(Shortform note: Past events seem obvious because of how your brain predicts: When shown two possibilities, your brain creates reasons why both are possible. However, once Possibility A is proven, your brain doesn’t need to retain information about Possibility B. It forgets that information, making you believe Possibility A was obvious all along. This altered memory also creates overconfidence. You forget any prior uncertainty or incorrect predictions, which reinforces your overconfidence about your pattern-finding and prediction abilities.)

You Value Things for Arbitrary Reasons

In this final section, we’ll cover fallacies that affect how you value things. According to Dobelli, humans tend to put value in a person, situation, or item for arbitrary and illogical reasons.

The Endowment Effect

Your valuation of an item shifts illogically when you own it: You subconsciously inflate its value simply because it’s yours, Dobelli explains. This is called the endowment effect.

(Shortform note: This effect stems from loss aversion. Once something is in your possession, you fear losing it, which makes you value the item more. You can avoid this fallacy by avoiding personal connections to items: However, doing so may harm your well-being. Valued belongings become an extension of your identity and a way to express your personality, and preventing those connections from forming can make you feel stifled and unable to be yourself.)

Liking Bias

Liking bias also affects how you value people, specifically. The more you like someone, the more value you put on their opinions and desires, Dobelli says. This means you’re more likely to do something for an individual you like, even if doing so goes against your own interests.

(Shortform note: Dobelli doesn’t say why liking someone makes you value them more. Some experts say you value people you like more because when you like someone, you form an alliance with them. Having a common goal (friendship) unites you and the other person, making you more likely to value them and fulfill their desires.)

The Sunk Cost Fallacy

Another error in thinking that affects how you value things is the sunk cost fallacy. According to Dobelli, the more time, effort, or resources you invest in something, the higher you value that thing. You'll also be more resistant to parting with it, even if keeping it means losing more time, effort, or resources in the future.

(Shortform note: This fallacy stems from a fear of waste: Most people try not to waste time, money, or effort, and letting go of something you’ve invested resources in feels like wasting those resources. While this is technically true—it is a waste of time, money, or effort—continuing to invest resources only creates more waste.)

You can overcome this fallacy by focusing on whether something is serving you in the present and will continue to do so in the future, rather than focusing on what you’ve invested in the past. (Shortform note: Dobelli’s suggestion to focus on the future doesn’t mean ignoring the past: Consider the past to make good decisions based on all the data you’ve collected, but don’t let past effort stop you from moving on.)
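Dobelli’s forward-looking rule can be captured in a tiny function (figures invented): the decision depends only on future costs and benefits, and what you’ve already spent never appears in the comparison.

```python
def should_continue(future_benefit: float, future_cost: float) -> bool:
    """Continue only if the future looks profitable; sunk costs are ignored."""
    return future_benefit > future_cost

already_spent = 50_000   # sunk cost: deliberately unused below
future_benefit = 10_000
future_cost = 30_000

print(should_continue(future_benefit, future_cost))  # False: cut your losses
```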


Here's a preview of the rest of Shortform's The Art of Thinking Clearly PDF summary:

PDF Summary Shortform Introduction

...

The Book’s Publication

The Art of Thinking Clearly was originally published in German (under the name Die Kunst des klaren Denkens) in 2011 by Carl Hanser Verlag. It was translated into English in 2013 by Sceptre, an imprint of Hodder & Stoughton. In 2014, a second edition was published by Harper Paperbacks, an imprint of HarperCollins, adding an extensive bibliography and citations. For this guide, we’re using the 2014 edition.

The Book’s Context

Historical Context

The Art of Thinking Clearly was published in 2011, on the heels of an unprecedented period of expansion in psychological study. Since the 1980s, interest in psychology had grown steadily. The founding of groups like the [Association for Psychological...

PDF Summary Part 1: Evolutionary Fallacies | Chapter 1: You Want to Belong to the Group

...

But we’re not hunter-gatherers anymore, so shouldn’t humans still be evolving to better fit their modern environment? Why do these logical fallacies persist if natural selection exists? There are three reasons:

  • Humanity is too spread out. When humans lived in small groups, beneficial genetic mutations could gradually expand through the group and then beyond. In the modern world, the human race is too large and spread out for genetic mutations to easily spread.

  • There’s a lack of environmental pressure. While global warming affects weather patterns, it isn’t extreme enough to trigger evolution. If a global disaster struck that drastically and permanently changed the weather or food supply, natural selection would become relevant again as humans would be forced to evolve to survive.

  • It hasn’t been long enough for us to evolve again. Humans were hunter-gatherers for almost 4 million years. It’s only...

PDF Summary Chapter 2: You Pay Attention to the Wrong Things

...

(Shortform note: Businesses try to draw popular attention to their products by standing out among their competitors. This is increasingly difficult as advertisements infiltrate all of modern life and people become adept at ignoring them. However, there’s a form of popular attention that is very common in modern life: star reviews. Whether it’s movies, restaurants, or an online purchase, people rely on this modern version of word of mouth when making decisions.)

Dobelli recommends avoiding the salience effect by ignoring the easiest or most obvious information and looking for the less flashy causes behind situations.

How to Overcome the Salience Effect

Dobelli’s suggestion to look for the less obvious causes behind situations can be difficult to put into action, since you’re competing directly against the powerful salience effect. Here are a few tips to make this easier:

1. Look for the less obvious causes of a situation by [asking a lot of “why”s, slowing down to consider the answers to these “why”s, and being critical about how much you really understand about the potential...


PDF Summary Chapter 3: You’re Using the Wrong Kind of Thinking

...

That said, thinking logically may feel uncomfortable to you—Gladwell notes that people prefer instinctive thinking to logic, which makes it hard to engage your logical side. In Thinking, Fast and Slow, Daniel Kahneman notes that you can encourage logical thinking by increasing the difficulty of your task. For example, if you’re reading an article, making the font smaller increases the effort you put into reading it, triggering your logical thinking and helping you analyze the article.

In this chapter, we’ll look at the following logical errors that come from using the wrong kind of thinking:

  • The conjunction fallacy
  • The affect heuristic
  • Hyperbolic discounting

The Conjunction Fallacy

The first incorrect use of thinking we’ll explore is the conjunction fallacy: the human tendency to prefer a plausible story to a probable one. In other words, Dobelli explains, when a story makes sense to you, you’re likely to believe it even if the true probability of it occurring is low.

For example, consider a girl named Katrina who...
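Whatever details the story adds, probability theory guarantees the detailed version can’t be more likely than the plain one, since P(A and B) = P(A) × P(B given A) ≤ P(A). A sketch with invented probabilities:

```python
# Invented probabilities for a plain claim vs. a detailed story about it.
p_plain = 0.10               # P(A): the bare claim
p_detail_given_plain = 0.30  # P(B | A): the extra detail, if A holds

# The conjunction can never exceed either component.
p_story = p_plain * p_detail_given_plain  # P(A and B)

print(p_story <= p_plain)  # True: the vivid story is the less probable one
```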

PDF Summary Chapter 4: You Struggle to Understand Complex Math

...

For example, a waiter could serve an average of 20 tables a day. Your brain instinctively assumes the number of tables waited per day stays close to that average, in a range like 15 to 25. However, there might be slow days where the waiter only serves 5 tables, or busy days where they serve 40. The average hides these outliers, so if you ignore the distribution, you don’t have a true idea of the waiter’s workload.

To get a true idea of the workload, you must remove the outliers, Dobelli says. This isn’t instinctive, but it’s important for modern life because outliers are increasingly common.


PDF Summary Chapter 5: Miscellaneous Evolutionary Fallacies

...

Action Bias

Another miscellaneous evolutionary fallacy is action bias: the tendency to take action rather than waiting for a better opportunity or more information, Dobelli explains. For early humans, delaying action could mean death (for instance, if a predator approached). While this is much less often the case for modern humans, we still instinctively want to act in all situations. However, this instinct causes problems in today’s complex world, when impulsive actions are more likely to cause problems than waiting and thinking things through. For example, impulsively deciding to invest in a fund without waiting and researching its likelihood of growth may lead to you losing money.

While the need for patience has grown, society's tolerance of it hasn’t, Dobelli says. People praise those who take action, even if that action was reckless or hasty, while accusing those who choose to wait and gather information of cowardice. (Shortform note: Others argue that Dobelli is only partially right. Studies show that people praise [leaders who quickly make positive decisions (like promoting someone) and see them as more trustworthy than those who...

PDF Summary Part 2: Non-Evolutionary Fallacies | Chapter 6: You Misinterpret Cause and Effect

...

Dobelli suggests using your knowledge of the result illusion to make realistic goals. If you confuse the cause (swimmers’ body type) with the effect (they’re good swimmers), you might set unrealistic goals, like swimming to gain a different body type. If you realize the swimmers were born with that body type, you can set healthier goals for exercising that fit your own body type.

The Result Illusion: Nature vs. Nurture

The result illusion is part of the nature vs. nurture debate: It’s hard to determine if people's genetics or upbringing dictate their characteristics. When you fall for the result illusion, you confuse the person’s nature—natural traits—with how they’ve been nurtured—traits they gained through their activities.

Dobelli implies that people’s traits are primarily assigned by nature, and thus that nurturing doesn’t have as large a role. However, others argue that the divide of importance between nature and nurture is more even: While it's true that some people do naturally have an optimized body for swimming, [you can also gain some of those traits by...

PDF Summary Chapter 7: Your Memory Is Not as Reliable as You Think It Is

...

Even people's strongest memories, usually made in joyous or traumatic situations, are dramatically altered as time goes by, Dobelli adds. (Shortform note: Many people believe these “flashbulb” moments remain untouched and are represented totally accurately in your memory. However, traumatic memories are often less accurate than other kinds of memories, not more accurate. Sometimes, traumatic memories are almost entirely fabricated because your ability to form memories is hampered in traumatic situations, not increased.)

Cognitive Dissonance

Falsification of history is often triggered by cognitive dissonance, or discomfort when your beliefs or desires conflict with your actions, Dobelli says. To alleviate this discomfort, you deny or rationalize your actions until your memories of the situation change: You deny that...


PDF Summary Chapter 8: You Struggle to Understand Probability and Predictions

...

(Shortform note: Dobelli claims that investors frequently invest solely based on the possible yield of an investment, rather than its risk, and uses this example to prove the validity of his claims regarding neglect of probability. However, this example isn’t accurate: Investors use various risk management strategies when investing, including analyzing the historical value of the stock to determine the risk of its value changing in the future.)

Denominator Neglect

A type of neglect that Dobelli doesn’t cover is denominator neglect, which Daniel Kahneman describes in Thinking, Fast and Slow. If probability is a fraction, with the situation you’re evaluating on top and the total number of possibilities on bottom, people base their judgment of risk solely on the top number, or numerator, and ignore the bottom number, or denominator. This...

PDF Summary Chapter 9: You Value Things for Arbitrary Reasons

...

Overcoming the Endowment Effect: Think Like a Vendor?

While Dobelli believes the endowment effect influences anyone who owns an item, others argue that it only activates when you’re going to use the item. If you have an item for the sole purpose of exchanging it, the endowment effect doesn’t apply. This is how vendors are immune to the endowment effect and can sell goods at fair prices.

Since accurately valuing and selling items is difficult only when they’re personal, you can get a more accurate sense of your belongings’ value by avoiding attachment and viewing them the way a vendor views stock. However, avoiding connection to items in this way may harm your well-being. Valued belongings become an extension of your identity and a way to express your personality, and preventing those connections from forming can make you feel stifled and unable to be yourself.

Liking Bias

Liking bias also affects how you value people, specifically. The more you like someone, the more value you put on...

PDF Summary Chapter 10: You Have Too Much of a Good Thing

...

3. Having too many options inspires uncertainty. After making your decision, you’ll be unhappy because you’ll never be sure it was the right choice, Dobelli says. With so many other options available, how do you know a different dishwasher wasn’t the better choice after all?

(Shortform note: This uncertainty stems from buyer’s remorse. You narrow your choice criteria, as discussed above, but you don’t forget about all the other options and features you’re sacrificing. You ignore those options to make your choice easier, but once you’ve made your decision, the knowledge of what you ignored returns, inspiring remorse and uncertainty.)

How can you avoid being overwhelmed by too many options? Dobelli suggests writing down the qualities that are important to you before evaluating your options. This stops you...

PDF Summary Chapter 11: Miscellaneous Non-Evolutionary Fallacies

...

Omission Bias

Another non-evolutionary fallacy is omission bias: When both acting and not acting have negative results, you’re prone to not acting. This bias causes problems when acting could at least mitigate the negative results, Dobelli explains. In other words, both Option A (active) and Option P (passive) cause negative result X to occur. Even though taking Option A means X will be less serious, you’ll choose Option P because of omission bias.

For example, if Option A was "act to close a school" and Option P was "passively let the school slowly fail," many would choose Option P. This is illogical, as letting the school fail is a waste of time and money, the school’s educational standards will drop over time, and the students will probably suffer more than if you closed the school. If you took Option A, while the students would temporarily be stressed and their education disrupted, they’d quickly find a new school, likely one with no chance of closing and more consistent educational standards.

Omission Bias and Morality

Why does omission bias occur? Dobelli doesn’t say, but others argue it’s because [you feel guilty when your action causes negative...