This article is an excerpt from the Shortform book guide to "The Righteous Mind" by Jonathan Haidt. Shortform has the world's best summaries and analyses of books you should be reading.
How can caring about what others think of us make us better people? What power does this kind of pressure have?
In The Righteous Mind, social psychologist Jonathan Haidt argues that caring about what others think of us can cause us to make better moral judgments. When we’re concerned about our reputation, we behave better.
Read more to learn how caring about what others think of us can make us better people.
Why Do We Care So Much About What Others Think?
In Plato’s Republic, Glaucon—a Greek philosopher and Plato’s older brother—argues that it is the appearance of justice that keeps us just, rather than any actual justice. In contrast, Socrates argues we should always act based on reasoned justice. Socrates says that a truly just city has to have a philosopher-king who can divine right from wrong. Socrates is the hero of the book, but according to Haidt, it’s Glaucon who’s right.
Whether or not we realize it (or think we should), we are prone to caring about what others think. We constantly act like intuitive politicians: guided by intuition, but concerned with justifying our actions so that others like and trust us. In social settings such as the workplace, appearance matters more than reality.
When people know they’ll have to justify a decision—when they care about what others think—they’re more self-critical and willing to revise their beliefs when presented with different evidence. Essentially, this is because people respond to outside social pressures. We’re looking for confirmation from the group that we’re right.
How We Think Like Politicians
Ultimately, we care about what others think. Haidt offers five examples showing that, morally, we think much more like a politician trying to win over constituents than like a scientist in search of truth.
- We are fascinated by polling data (about ourselves): Experiments show that no matter how much someone says they don’t care what others think of them, their self-esteem plummets when they’re told that strangers don’t like them and rises rapidly when they’re told strangers do. On an unconscious level, we’re constantly measuring our social status. The intuitive part of the mind cares about what others think of us, even if the rational part of the mind doesn’t.
- We all have a “press secretary,” constantly justifying everything: In other words, we all have confirmation bias and are constantly on the hunt, like a press secretary, for evidence that justifies our way of thinking. Simultaneously, we ignore anything that might challenge it. Research shows that people with higher IQs can generate more arguments to support a viewpoint, but only for their own side. As soon as the intuitive part of the mind leans in a direction, the rational part starts looking for reasons to explain it.
- We rationalize cheating and lying so well that we can convince ourselves we’re honest: Like politicians, when given the opportunity and plausible deniability, most people will cheat yet still believe that they are virtuous. They cheat up to the point where they can no longer rationalize the cheating. In one study, when a cashier handed a subject more change than she was due, only 20% of subjects corrected the mistake; because they were passive participants in the transaction, they could reconcile keeping the extra money with the belief that they were honest people. However, when the cashier asked whether the amount was correct, 60% of subjects corrected the mistake and returned the extra money. In that case, it was harder to deny responsibility, because the cashier had asked them directly.
- We can reason ourselves into any idea: If we want to believe in something, we ask, “Can I believe it?” and look for reasons to believe. As soon as we find a piece of evidence, even if it’s weak, we stop searching and feel justified in that belief. On the other hand, if we don’t want to believe something, we ask, “Must I believe it?” and look for reasons not to. If we find even one piece of counterevidence, we feel justified in not believing it. In sum, unlike scientists, who generally change their theories in response to the strongest evidence, most people believe what they want to believe.
- We believe any evidence that supports our “team”: This is why people don’t vote based on their self-interest. Rather, people care about their groups—political, racial, regional, religious—and base their decisions on their participation in those groups. For example, when people are shown hypocritical statements made by political leaders in their chosen party, they start squirming and looking for justifications. On the other hand, when they see the same hypocrisy from an opponent, they delight in it and don’t attempt to justify it. Furthermore, when they’re shown a statement that releases their candidate from something that looked hypocritical, they get a hit of dopamine. The brain of the partisan starts to need that dopamine—being a partisan person is literally addictive.
These rationalizations don’t lead or create our morality. Rather, rationalizations happen after we make decisions, in order to justify our intuition and convince others (and ourselves) that we’re moral beings. In fact, studies show that expertise in moral reasoning (such as being a moral philosophy professor) does not make people any more moral, and might actually make them worse, because they’re better at making post hoc justifications.
We have evolved not to find truth but to argue, persuade, and manipulate when necessary to get others to like us or take our side. This is why confirmation bias is so strong. We should thus recognize that an individual’s power to reason is limited. However, if we put many individuals together, the interactions between them can produce good reasoning, provided those individuals respect one another. Think of a group of people discussing how to fix a car. They might have different ideas about what’s wrong with it, based on their different experiences, but if they respect one another, they’ll be much more likely to reach a solution together than any one of them would be working alone. Any group looking for truth should have ideological and intellectual diversity.
If the goal is to produce good behavior, then we should trust intuition over reason even more, and create environments like the one Glaucon believed was necessary for a just society: one in which people, concerned about their reputations, behave more ethically. This is how caring about what others think of us can make us better people.
———End of Preview———
Like what you just read? Read the rest of the world's best book summary and analysis of Jonathan Haidt's "The Righteous Mind" at Shortform.
Here's what you'll find in our full The Righteous Mind summary :
- Why we all can't get along
- How our divergent moralities evolved
- How we can counter our natural self-righteousness to decrease political divides