This article is an excerpt from the Shortform book guide to "The Righteous Mind" by Jonathan Haidt.
Why does moral intuition kick in before moral reasoning? Is reasoning a total slave to emotion?
Social psychologist Jonathan Haidt’s experiments indicate that we react first with moral intuition, and then we employ reason. In his book The Righteous Mind, he provides additional evidence for this conclusion and explains why this is the way we make moral judgments.
Read more to learn about moral intuition.
Intuition and Reasoning Are Like an Elephant and Its Rider
We respond to stimuli first with moral intuition and only afterward use reasoning to justify our response. This article deepens that understanding and provides examples of why and how this happens.
Think of intuition and reasoning like an elephant and its rider:
- The elephant (intuition) is huge and moves mostly on its own.
- Occasionally, though, the rider (reasoning) tries to guide it somewhere. This can be useful because, from the rider's perch, it's easier to see far into the future (that is, to run logical scenarios in our heads). The rider can also learn new technologies and skills, which can help the elephant reach its goals. Finally, the rider is the elephant's spokesperson, even though the rider can't always tell what the elephant is thinking.
We’ll begin with moral intuition and move on to moral reasoning.
Moral Intuition First
In addition to Haidt’s experiments, there’s ample evidence that intuition comes before reasoning:
- Our brains are instantly evaluating: Every time we see something, we have what's called an "affective" reaction. Something as simple as reading a positive word, like "happiness," produces a small flash of positive affect. Evolutionarily, this kind of feeling was the first process humans developed; thinking came second.
- All of our social and political judgments are intuitive: People have immediate, intense reactions when they encounter social groups, and most people hold implicit biases against certain groups; think of the elephant seeing something and leaning away. These biases have nothing to do with reasoned morality. For example, most younger people are implicitly biased against the elderly, not for any moral reason, but because they have an inherent bias against people who are unlike them and whom they don't understand. The same is true in politics: when conservatives see the name of a liberal president, they have an immediate negative reaction, and when liberals see the name of a conservative president, they react just as negatively.
- Our bodies influence our judgment: For example, seeing, tasting, or smelling something disgusting can make our moral judgments harsher. A grad student at Stanford tested this by having people fill out a questionnaire about their opinions on controversial issues. One group of respondents answered in a room that smelled bad; another did not. The group in the bad-smelling room gave much harsher opinions on the issues. The reaction to the smell was pure intuition, and it overpowered the respondents' reasoning.
- Psychopaths can reason but can't feel: When the elephant isn't functioning properly, it's difficult to be a productive member of society. Psychopaths have some emotions but lack empathy for others. The rider functions fine, but the elephant doesn't respond. Because they base their moral judgments on reason alone, without the emotional responses that make most people care about others' suffering, psychopaths can break the basic social contracts that bind society together, feeling nothing even about torturing or killing others.
- Babies can feel but can't reason: Experiments show that babies, though they can't yet reason, have an innate understanding of their environment. They'll stare longer at something that appears physically impossible, like a car traveling through a wall. They can understand social interactions as well. Shown a puppet show in which one puppet tries to climb a hill, a second puppet helps it up, and a third puppet tries to stop it, babies register surprise when the climbing puppet later befriends the hinderer. By six months old, infants develop a preference for characters who are nice to others, even when the infants' own interests aren't at stake.
- Affective reactions happen in the right place at the right time: The famous "trolley problem" (in which pushing one person to his death would save five lives) pits utilitarianism against deontology. Utilitarianism suggests that you should push, because doing so produces a greater overall good. Deontology says that your duty to respect individuals' rights means you shouldn't push. In practice, deontological judgments generally come from a gut feeling, whereas utilitarian judgments are more calculating and based on reason. Studies show that areas of the brain involved in emotional processing activate immediately when direct, personal harm is involved. We feel strongly that certain actions are okay and others are not, and when immediate harm is involved, our brains react to those feelings, making it unlikely that we'd push someone into harm's way in the moment.
The bottom line is that when we see or hear anything in the world, the elephant (emotion and intuition) reacts right away, before reason has a chance to.
Reasoning Second
There are, though, certain cases when reason can make us revise our moral intuition, so reason might not be a total slave to passion. The elephant is more powerful, but not all-powerful.
- We mostly change our minds on questions of morality by interacting with others. Someone else can poke holes in our logic more easily than we can ourselves. This is especially true if there's affection between the two parties: if we like someone, we're more willing to look for the truth in what they're saying, and that openness can change our minds.
- There are also times when we have conflicting intuitions, when the elephant wants to lean two different ways at once. We start off following one intuition and then change our minds, but generally only if we felt that internal conflict from the start.
- It's also possible, though rare, for someone to reason their way to a different conclusion. Studies show this happens when we have to sit and think about our own arguments before answering a question: given enough time, we can reason our way to a counterargument against our initial intuition.
The more we understand the roles that moral intuition and moral reasoning play in our moral judgments, the better we understand ourselves and each other.
———End of Preview———