(Image: a woman with two thought bubbles above her, one containing a green checkmark and one a red X, illustrating right and wrong thinking)

Why does being right feel so good? How might being wrong actually be a survival skill?

In her book Being Wrong, Kathryn Schulz explores the fascinating world of right and wrong thinking. She explains how our beliefs are formed and why we cling to them even when they’re incorrect, and she offers insight into the evolutionary benefits of both being right and being wrong.

Keep reading to discover how our brains process uncertainty and why admitting mistakes is so challenging.

Right and Wrong Thinking

Schulz doesn’t define right and wrong thinking in terms of “truth,” instead focusing on the experience of changing your mind from one idea to another. This is because the traditional definition of “being wrong”—that you believe something is true when it isn’t—implies that there’s an underlying “truth” that every belief can be judged against. While this may apply to some situations, such as misremembering where you left your keys, it doesn’t hold for others, such as matters of personal taste or opinion. In these instances, we act as if opinions can be wrong (whether pickles taste good or bad, for example) even though there’s no objective truth either way.

(Shortform note: Often, we’ll judge the “truth” of a subjective belief—such as whether a goal is achievable—by the outcome of that belief. In some cases, the act of belief itself can become a self-fulfilling truth. In his biography of Steve Jobs, Walter Isaacson writes that the Apple cofounder exuded what his employees dubbed a “reality distortion field.” Jobs would set ambitious project goals and insist they were attainable even when his workers told him he was wrong. In many cases, Jobs was proven right by his subsequent success, but had he not pushed his beliefs so strongly, he would have been proven “wrong”—as Schulz defines the term—and forced to pivot, as he did after several of his failed business ventures.)

Right thinking and wrong thinking are both essential aspects of being human, says Schulz. Being “right”—assuming that our beliefs are true and acting on them accordingly—is a survival tool we inherited from our prehistoric ancestors. After all, if you’re on the savannah and hear a noise you believe to be a lion, it’s better to act on that belief without question than to wait and study the matter in depth, increasing your risk of turning into a snack. As a result, being right feels good; evolution has rewarded that feeling. However, Schulz argues that our ability to be wrong is also a survival skill in that it lets us imagine a different world than the one we live in—a world that, while technically “wrong,” helps us look past our limited perceptions to solve problems.

For example, imagine that your car—which you thought was running fine—breaks down on the highway. At once, your mind starts generating theories about why your car malfunctioned. Most of these thoughts will be just as incorrect as your prior belief that your car was in good order, but they let your mind work on solutions to various contingencies until you have more data. This same impulse let our ancient ancestors imagine better hunting grounds past the horizon, whether or not they were actually there, driving the human race to spread across the globe.

(Shortform note: Schulz equates imagination with error, since whatever you imagine is, by definition, a mental model that exists apart from objective reality. However, some psychologists prefer to characterize imaginative problem-solving as a form of play. This is especially evident in children, who envision scenarios from multiple points of view while adopting various roles to imitate adult behavior. This shift in perspective not only opens your mind to new possibilities but also fosters empathy by letting you imagine life beyond your direct experience. What you imagine may certainly be “wrong” in that you can’t exactly model what you don’t know firsthand, but it’s a mind-broadening exercise nonetheless.)

We’re Programmed to Believe

Research into how beliefs are formed sheds even more light on why it’s so easy to believe what isn’t true. The key lies in how our brains process uncertainty. In Thinking in Bets, poker expert Annie Duke explains that life is so full of randomness and uncertainty that most of the decisions we make aren’t right or wrong but exist on a spectrum from poor to pretty good. Therefore, in the name of efficiency and fast decision-making, evolution has tuned our brains to prioritize reducing uncertainty. This increases our odds of making good judgments quickly rather than dithering over an issue (such as whether that’s really a lion stalking you) until the effort to be right puts our survival at risk.

Duke goes on to explain that because our ancestors’ survival relied on trusting their senses, the belief-formation process in our brains is predisposed to accept things as fact rather than to doubt them. Studies have shown that our minds process information as if it’s true even if it’s explicitly presented as false, especially in times of stress. And once we’ve accepted one belief as true, we use it as a framework to process and accept other potentially false beliefs, changing our very perception. The science, therefore, backs up Schulz’s premise that holding mistaken beliefs is baked into our nature—it’s there in the wiring of our brains.

Since we like to cling to the feeling that we’re right, the conflicts we experience aren’t between “right” and “wrong” but between opposing views of “right,” writes Schulz. Our society doesn’t even afford us a healthy common language for admitting to mistakes without shame. Instead, we prioritize being right above all else while happily pointing out the mistakes of others. When we’re forced to confront our errors, we either shift the blame or distance ourselves from our wrongness with the classic line, “Mistakes were made.” This robs us of the lessons we might learn from opening ourselves to the chance that we’re wrong, thereby harming relationships between people, cultures, religions, and nations.

(Shortform note: Part of the reason we don’t admit we’re wrong is that having to do so induces a form of mental tension called cognitive dissonance. In Mistakes Were Made (But Not By Me), Carol Tavris and Elliot Aronson define cognitive dissonance as the uncomfortable state of holding two or more contradictory beliefs at once. In the case of admitting that you’re wrong, the dissonance comes from wanting to maintain your belief in yourself as a rational person while also believing in your capacity for error. According to Tavris and Aronson, we often try to defuse this mental state by justifying our beliefs to ourselves in ways that drive us apart from other people whose beliefs and ideas don’t align with our own.)

