This article is an excerpt from the Shortform book guide to "Critical Thinking, Logic & Problem Solving" by Bigrocks Thinking. Shortform has the world's best summaries and analyses of books you should be reading.
Like this article? Sign up for a free trial here.
Do you tend to favor the information you hear first? Do you give more weight to the ideas of people you like?
Gathering information is the first step in the critical thinking process. However, you should exercise caution. Some information has baggage—biases that distort reality.
Keep reading to learn about four common types of biases that can get in the way of the truth.
4 Common Types of Biases
The authors of Critical Thinking, Logic & Problem Solving recommend that you assess your source for any potential biases. Biases are errors in thought processing that result from generalizations the brain makes. These generalizations can help you make decisions more quickly, but they can also negatively impact your critical thinking. Biases are often impossible to avoid, but being aware of the biases in your sources can help you avoid them in your own thinking. The authors identify common types of biases you should be aware of.
(Shortform note: In Biased, Jennifer Eberhardt explains that biases can be particularly difficult to notice because some of them are ingrained in us from an early age and stay with us unconsciously. These biases affect what information we notice and prioritize and what information we automatically ignore. In assessing bias, don’t just look at the information being presented, but pay attention to what’s left unsaid as well, and look for sources that omit important information or perspectives. Additionally, sources may be biased if they stand to benefit in some way from convincing you of their claim, if they use a lot of subjective language or hyperbole, or if they don’t clearly cite their sources.)
The authors list several common types of biases, including the following four.
Bias #1: Anchoring Bias
Anchoring bias is the tendency to give more weight to the information you hear first than to information you gather later. For example, say you’re a parent and your children are fighting with each other. You tell them to stop and then ask each of them, in turn, to explain why they were fighting. Anchoring bias would make you more likely to believe the first child’s explanation and more likely to doubt the second.
(Shortform note: Anchoring bias may result in part from the phenomenon of psychological priming. When we are exposed to an idea, that idea activates, or primes, parts of our brain, and those parts of the brain stay active as we process further information. This can affect our thinking by anchoring us to the first idea we heard and to the mental connections we drew from it. To avoid anchoring bias, think of counterarguments or alternative options and look for reasons why they might be better than the anchored information.)
Bias #2: Confirmation Bias
Confirmation bias is the tendency to give more credence to information that confirms what you already believe and dismiss information that isn’t in line with your beliefs.
(Shortform note: In The Art of Thinking Clearly, Rolf Dobelli explains that some people use confirmation bias to take advantage of others, like how fortune tellers use vague statements that people can interpret to be true in various ways. Such vague statements confirm ideas the listener already had and thus make them more likely to believe the fortune teller. To avoid confirmation bias, Dobelli suggests you note the different beliefs you hold and find information that refutes them so you can assess their accuracy.)
Bias #3: Halo Effect
The halo effect is when your positive view of a person or thing leads you to judge their opinions or claims more favorably than you otherwise would.
(Shortform note: The opposite of the halo effect is the horn effect, where you make negative judgments about a person based on a negative trait they exhibit. The halo effect is sometimes also reversed, and you may find yourself making negative judgments about a person based on a positive trait they exhibit—like assuming that attractive people are unintelligent.)
Bias #4: The Dunning-Kruger Effect
The Dunning-Kruger effect is when you overestimate your own abilities; people with the least skill or knowledge in an area often overrate their competence in it the most.
(Shortform note: The opposite of the Dunning-Kruger effect is imposter syndrome, which is when a capable person underestimates their own abilities. To avoid both of these biases, objectively compare yourself and your work to others to see how your respective abilities, accomplishments, and levels of confidence match up. Additionally, be open to feedback from others and use it to improve your performance.)
———End of Preview———
Like what you just read? Read the rest of the world's best book summary and analysis of Bigrocks Thinking's "Critical Thinking, Logic & Problem Solving" at Shortform.
Here's what you'll find in our full Critical Thinking, Logic & Problem Solving summary:
- A step-by-step guide for improving critical thinking and problem-solving skills
- Tips for conducting better research and finding reliable resources
- How to improve your communication and storytelling skills