Why do humans make mistakes? What are the root causes of our errors?
In her book Being Wrong, Kathryn Schulz explores several causes of human error. She explains how our brains process information and make decisions, and she details three main factors that contribute to our mistakes.
Read more to uncover some of the reasons behind our blunders.
Causes of Human Error
The fact that human error is practically a given implies that there’s something fundamentally wrong with how human beings perceive the world. That would be true if the purpose of the brain were to perfectly analyze information from your senses, but it’s not. Instead, the brain is optimized to make quick judgments from limited data and to choose behaviors that promote our survival. As a result, our minds are built on heuristics—mental shortcuts to efficient, best-guess thinking based on limited information, mental models, and the collective judgment of whatever groups we’re a part of. According to Schulz, these tools make our minds amazingly efficient, but they’re also the loopholes through which we make mistakes. She discusses three such causes of human error that are helpful to understand.
Cause #1: Perception and Reason
The most basic and natural mistake that we make is to trust the evidence of our senses without question. The senses are the mind’s only window on the world, but we underestimate the degree to which that window is clouded by how the brain filters data through an unconscious interpretive process, followed by instinctive reasoning that doesn’t rely on strict rules of logic.
Schulz explains that our conscious minds don’t receive the “raw data” from our senses. Instead, the information we see, hear, and feel is touched up and processed by our unconscious nervous system—for example, the brain’s visual cortex fills in the gaps between the still frames of a movie. Our brains fill in the gaps in any sensory data we receive, which gives us a more cohesive awareness of our surroundings and aids our survival in the wild—but it also opens the door to error, particularly when the brain’s “best guess” to fill those gaps turns out to be wrong.
Our misperceptions are amplified when we use them to make decisions, a process that also relies on “best guess” logic more than we’d like to admit, writes Schulz. Rather than using careful logic to make decisions and judgments, our brains default to inductive reasoning—determining what’s likely to be true based on past experience. This mental shorthand allows for quick decisions that are right most of the time, but inductive reasoning also leads us into cognitive traps, such as making overly broad, biased generalizations while ignoring information that doesn’t jibe with our beliefs. As with our sensory errors, our imprecise reasoning is a natural side effect of the processes that make our brains so efficient.
Cause #2: Belief and Imagination
Our potentially faulty judgments based on erroneous sensory data form the shaky foundations on which we build beliefs, giving us even more chances to be wrong. Our beliefs shape everything we do, and they’re so interwoven and interconnected that admitting one of them is wrong can threaten our entire framework of understanding. Schulz discusses how beliefs are formed and how easily we invent them on the fly with little or no evidence to back them up.
Beliefs are stories we tell about the world. We’re conscious of some, such as beliefs about money, but unconscious of others, such as which way is “down.” We cling to some beliefs very tightly, while others change more easily, depending on how important they are. Schulz says that we automatically form beliefs about every new thing or idea we encounter, because otherwise we wouldn’t have a way to determine how to act or predict what will happen. This belief formation is a two-pronged approach—part of your mind creates a story to explain what your senses tell you, while another part of your mind checks your story against further input from your senses. Either side of this process can break down, resulting in beliefs that are wrong.
A well-performed magic trick is an example of how your senses and storytelling combine to produce faulty beliefs. In this case, the limits of your senses are deliberately “hacked” by the magician, who conceals the mechanics of the trick so that no matter how closely you watch, you’re deprived of essential information about it—just as, Schulz asserts, we are in most situations we experience. Meanwhile, the magician provides a running commentary to shape the narrative forming in your mind—one that’s at odds with how the trick is actually performed. Even when you know that what you’re seeing isn’t possible, it’s hard to disbelieve the evidence of your eyes and the story the magician tells you.
Your senses can fail you, but even when they don’t, the storytelling side of your brain can go wrong when you spin beliefs from sheer imagination. Schulz acknowledges that imagination is an evolutionary gift that lets us solve problems we’ve never faced before, but it leads us astray when we invent stories without any evidential grounding. We do this because our minds crave answers, so we feed them by making up theories. For instance, if your cat disappears and later shows up again, you’ll automatically start guessing where it went. The trouble is that these guesses quickly solidify into firmly held beliefs. Admitting that you don’t know something is more uncomfortable than pretending that you do, hence the temptation to believe things too strongly.
Cause #3: Social Pressure to Believe
Being wrong doesn’t happen only on an individual level. We don’t form our beliefs on our own, and history has shown that large groups of people can all be wrong at once. Try as you might, there’s no way to avoid learning beliefs and behaviors from the people around you, and when you’re firmly embedded in a group, any fallacies in the thinking of that group are only reinforced by the strength of group identity.
“Think for yourself” is common advice, but unfortunately, you can’t do it. We all rely on other people’s knowledge—there’s too much in the world to learn on our own. The problem is telling whether someone else’s beliefs are worth adopting. Schulz argues that we generally don’t judge other people’s beliefs on their merits. Instead, we first decide whether the person is trustworthy—if they are, we accept their beliefs. This is a time-saving shortcut that lets us learn from teachers and parents, determine which news articles to read, and decide which opinion podcasts to listen to. However, this shortcut opens up a world of error because it multiplies our own faulty judgment by that of many others. Mistakes spread like a plague.
Schulz says that, in years past, we formed many of our beliefs based on the groups we were raised in, but, in the Information Age, we seek out and form groups based on shared beliefs. Group consensus is a powerful drug, and in a group based on common ideas, belief in those ideas is self-reinforcing, while any evidence against them is dismissed through the group’s willful blindness. For instance, consider how strongly groups of music fans react to criticism of the artists they enjoy, even when those artists’ work is in decline.
When social status and group membership are defined by your agreement with certain beliefs, then any dissent is an attack on the group and can be punished by shunning, expulsion, or worse. Human beings are social animals, and it’s easier to go along with questionable ideas than lose your group status.