What does it mean to be rational? Why is being rational important? How can you be a more rational person?
According to Steven Pinker, rationality and reason are essential for improving our world and society. But people often misunderstand them, acting irrationally even when they think they're not. In his book Rationality, he unpacks this mystery and seeks to help people think and behave more rationally.
Continue reading for an overview of this bestselling book.
Overview of Steven Pinker’s Rationality
Steven Pinker’s Rationality examines how you can be more rational and make better decisions by improving your critical thinking skills and by understanding—and thus avoiding—the logical fallacies and cognitive blunders that people often fall victim to.
Steven Pinker is a cognitive psychologist whose bestselling works include The Stuff of Thought (2007), The Better Angels of Our Nature (2011), and Enlightenment Now (2018). He's a member of the National Academy of Sciences and has taught at several universities, including Harvard, Stanford, and MIT. As an experimental psychologist, he's conducted research on visual cognition, psycholinguistics, and social relations.
We’ll explore Pinker’s definition of rationality, his argument for why it matters, and his advice on how you can think more rationally about your choices. We’ll also go over the key reasons why humans, despite our ingenuity, often think and behave so irrationally.
What Is Rationality?
Pinker defines rationality as the use of knowledge to attain goals. Within this definition, knowledge is a belief that can be proven true. Both of these aspects—goals and true, provable knowledge—are critical to rationality. You're not acting rationally if you act on false beliefs, nor are you acting rationally if you don't apply your beliefs to a goal—a rational person must have a purpose for their thinking, whether that's navigating around a physical obstacle or determining the truth of an idea.
Why Rationality Matters
Pinker contends we as humans have a moral obligation to think rationally, for two reasons.
First, rationality is moral because it’s through rationality that we build a better society. Reason has enabled us to make valuable discoveries, inventions, and social progress that have allowed humans to live better lives. Because we have a moral imperative to improve the world and the lives of others, and since rationality is a key aspect of our ability to do so, we have a moral obligation to think rationally.
Second, morality hinges on agreement on right and wrong. Reason allows us to think impartially, consider conflicting interests, and determine what contributes to the common good. Because rationality is the only thing that allows us to come to a collective consensus on anything, it’s the only way to prevent one group from deciding what’s right and forcing their ideas on others, which is inherently immoral.
Rational Choice Theory
Pinker explains that economists, philosophers, and others have defined rationality using rational choice theory, which states that people choose between options with the aim of maximizing the rewards of their decisions.
More specifically, when making a rational decision, a person considers the possible outcomes of each option, judges how much they want (or don’t want) each outcome, and considers how likely each outcome is to happen. This analysis yields an option’s “expected utility”: its potential rewards weighted by both desirability and probability. A rational person then chooses the option with the highest expected utility.
For example, if someone is deciding whether to pack a warm coat for a trip, they'll first weigh the possible outcomes of each option by desirability. If the weather gets cold, they'll be happy they packed the coat. But if the weather ends up being warm, they'll be annoyed that they wasted space in their suitcase on such a bulky, heavy item. They'll then weigh the likelihood of each outcome: Is it more likely the weather will be cold or warm? From there, they'll determine which option has the highest expected utility and make that choice.
Pinker notes that one person’s judgment of expected utility might differ from someone else’s if their values are different. This doesn’t mean that either choice is more or less rational than the other—they can be different but still be equally rational. If Joe is sensitive to cold, he might opt to bring a coat even if it might mean overpacking, while Jenny, who doesn’t mind the cold as much but hates overpacking, might leave her coat at home even if it might mean being chilly. Both choices are rational because they each align with the expected utility as determined by each chooser.
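To make the expected-utility calculation concrete, here's a minimal sketch in Python. The utility scores and the 30% chance of cold weather are invented for illustration; they aren't figures from the book.

```python
# Minimal sketch of an expected-utility calculation for the coat example.
# The utility scores and the 30% chance of cold weather are invented.

p_cold = 0.3  # assumed probability that the weather turns cold

# Utility of each outcome for each choice (arbitrary units)
outcomes = {
    "pack coat":  {"cold": 10,   # warm and comfortable
                   "warm": -3},  # wasted suitcase space on a bulky item
    "leave coat": {"cold": -10,  # chilly for the whole trip
                   "warm": 2},   # lighter luggage
}

def expected_utility(option):
    """Weight each outcome's utility by its probability and sum the results."""
    utilities = outcomes[option]
    return p_cold * utilities["cold"] + (1 - p_cold) * utilities["warm"]

for option in outcomes:
    print(f"{option}: {expected_utility(option):+.2f}")

# A rational chooser picks whichever option scores higher. Changing the
# utilities (say, for someone who minds overpacking more than the cold)
# can flip the answer, which is why two different choices can both be rational.
```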
Pinker notes that, while people may not have the time or inclination to thoroughly calculate the expected utility of every decision, they'd make better decisions overall if they thought in these terms more often. In the following sections, we'll look at reasons people fail to think rationally and techniques they can use to think critically more often.
Logic and Critical Thinking
Pinker writes that one of the main reasons people think irrationally is that they use logic and critical thinking incorrectly. Critical thinking (also known as deductive reasoning) is the ability to accurately assess logic—to judge whether a conclusion is true based on its premises.
People often misinterpret logic and thus make fallacious arguments that lead to irrational conclusions. Pinker outlines two kinds of fallacies that people tend to succumb to: formal (where the conclusion doesn't follow from the premises) and informal (where the premises themselves are faulty).
Formal Fallacies
A formal fallacy is one where the conclusion doesn't follow logically from the premises, violating the form (the structure) of a logical argument. The premises may well be true, but the conclusion drawn from them doesn't follow, so the argument isn't valid.
One common formal fallacy is called denying the antecedent (that is, noting that the "if" part of a conditional is false and concluding that the "then" part must be false as well):
- Premise: If A, then B.
- Premise: Not A.
- Conclusion: Therefore, not B.
Here, the conclusion is fallacious because it doesn't necessarily follow from the premises. For example, you might say:
- If a creature is a fish, it can swim.
- A human is not a fish.
- Therefore, a human cannot swim.
In this case, both premises are true, but the conclusion is false.
Another common formal fallacy is called affirming the consequent (noting that the "then" part of a conditional is true and concluding that the "if" part must be true as well):
- Premise: If A, then B.
- Premise: B.
- Conclusion: Therefore, A.
This might play out as:
- If a creature is a fish, it can swim.
- Humans can swim.
- Therefore, humans are fish.
This type of fallacy, affirming the consequent, is a common one because it implies a reciprocity that seems straightforward but often isn't true: Just because A implies B, we can't conclude that B implies A. Without deeper reflection, though, that conclusion can seem valid.

This fallacy can lead us to poor decisions. For example, "Groundbreaking, blockbuster products are always ones that are new to the market" does not mean "Products that are new to the market are always groundbreaking blockbusters." But if an entrepreneur convinces us her new product is guaranteed to be successful simply because no one's ever seen it before (relying on this fallacy), we might lose money on a poor investment.
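To see concretely why these two argument forms fail, here's a small Python sketch (not from the book) that enumerates every truth assignment for A and B and checks whether the premises can all be true while the conclusion is false. Modus ponens is included for contrast as a valid form.

```python
from itertools import product

def implies(p, q):
    """Material conditional: 'if p then q' is false only when p is true and q is false."""
    return (not p) or q

def is_valid(premises, conclusion):
    """An argument form is valid iff no assignment makes every premise true and the conclusion false."""
    for a, b in product([True, False], repeat=2):
        if all(premise(a, b) for premise in premises) and not conclusion(a, b):
            return False  # found a counterexample
    return True

# Denying the antecedent: If A then B; not A; therefore not B.
print(is_valid([lambda a, b: implies(a, b), lambda a, b: not a],
               lambda a, b: not b))   # False -> invalid

# Affirming the consequent: If A then B; B; therefore A.
print(is_valid([lambda a, b: implies(a, b), lambda a, b: b],
               lambda a, b: a))       # False -> invalid

# Modus ponens, for contrast: If A then B; A; therefore B.
print(is_valid([lambda a, b: implies(a, b), lambda a, b: a],
               lambda a, b: b))       # True -> valid
```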
Informal Fallacies
Unlike formal fallacies, informal fallacies aren’t invalid. That is, the conclusion logically follows from its premises. However, the premises are faulty or irrelevant.
One example is the straw man fallacy, in which an argument is intentionally simplified, falsified, or otherwise misrepresented. For example, if a school principal says, "We must introduce more science into our curriculum," someone using a straw man fallacy might retort, "So, you don't care about the arts?" This conclusion relies on the premise that if a principal wants more science, she doesn't care about the arts—a faulty premise.
Pinker writes that people create informal fallacies because they like to win arguments and they’ll take shortcuts to do so: Instead of building solid arguments based on true premises, they’ll make arguments that sound logical and will hope no one examines their premises too closely.
Probability
Pinker writes that an important aspect of rationality is understanding how probability works and using it to make better decisions.
Probability can be best understood as the chance that an event will happen given the opportunity. When someone says there is a 50% chance of an event occurring, this means it will happen 50 out of 100 times, on average. If you flip a coin 100 times, the head-to-tail ratio may not be exactly 50-50, but there’s still a 50% chance of either heads or tails with every flip.
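As a quick illustration of that last point, the short simulation below (not from the book) flips a fair coin 100 times over many runs: each individual flip stays 50-50, yet an exact 50-50 split across 100 flips is actually uncommon.

```python
import random

random.seed(42)  # fixed seed so the run is reproducible

# Flip a fair coin 100 times, repeated over many runs, and count how often
# the result is an exact 50-50 split.
runs = 10_000
exact_splits = sum(
    sum(random.random() < 0.5 for _ in range(100)) == 50
    for _ in range(runs)
)

print(f"Exactly 50 heads out of 100 flips in {exact_splits / runs:.1%} of runs")
# Roughly 8% of runs land on an exact 50-50 split, even though every single
# flip is still a 50% chance of heads and a 50% chance of tails.
```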
People often miscalculate the probability of an event occurring and then make poor decisions based on that miscalculation. In this section, we'll cover a few key reasons why this happens: how certain cognitive biases, or heuristics, misdirect us, and what we can do to prevent that.
The Availability Heuristic
According to the availability heuristic, people judge the likelihood of an event based on how readily they remember it happening before, rather than by rationally calculating the probability based on how often the event actually happens. That is, they rely on what information is most available, rather than what is most representative of the truth.
A commonly cited example is when people are more afraid of flying on an airplane than driving in a car. The probability of getting injured in a car is higher than in a plane, but people remember plane crash headlines better. They therefore falsely believe plane crashes are more probable. This false belief can be harmful: Pinker notes that by irrationally preferring to drive, many have likely driven to their deaths rather than fly on a safer aircraft.
Post Hoc Probability Fallacies
Another common probability blunder Pinker discusses is the post hoc probability fallacy. This is when, after something statistically unlikely occurs, people believe that because it happened, it was likely to happen. They fail to account for the number of times the event could have occurred but didn’t, and the probability that, given an enormous data set (an almost infinite number of events), coincidences are going to happen.
Post hoc probability fallacies are driven by the human tendency to seek patterns and ascribe meaning to otherwise random or meaningless events. This tendency fuels superstitions like astrology and belief in psychic powers, among other irrational beliefs about the world. It's why, for example, if a tragedy occurs on a Friday the 13th, some will believe the date is cursed, ignoring the many Friday the 13ths that have passed without tragedy and the many tragedies that have occurred on other dates.
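A short sketch with invented numbers shows why this reasoning misleads: even a one-in-10,000 coincidence becomes nearly certain to happen somewhere once there are enough opportunities for it to occur. The 1-in-10,000 figure is purely illustrative.

```python
# Chance that a rare coincidence happens at least once, given many chances.
p = 1 / 10_000  # probability of the coincidence on any single occasion (illustrative)

for opportunities in (1, 100, 10_000, 30_000):
    at_least_once = 1 - (1 - p) ** opportunities
    print(f"{opportunities:>6} opportunities -> {at_least_once:.2%} chance of at least one occurrence")

# With enough occasions, an "unlikely" event is almost certain to happen to
# someone, somewhere -- no curse or psychic power required.
```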
Using Bayesian Reasoning to Counter Probability Fallacies
Pinker says that to avoid falling for fallacies like these, we can use Bayesian reasoning, a mathematical approach named after the eighteenth-century thinker Thomas Bayes that bases judgments of probability on evidence (information about how often an event actually occurs). We won't detail the full equations of Bayes' theorem here, but essentially, it helps you consider all the relevant probabilities associated with a possible outcome to determine its true likelihood, which can differ greatly from the likelihood that seems intuitively correct.
One common use of this theorem is in determining the probability of a medical diagnosis being correct, which Pinker says is an archetypal example where Bayesian reasoning can aid in accurate assessments of probability. Let’s say you test positive for cancer. Most people (including many medical professionals) might believe that because the test came back positive, there’s an 80-90% chance you have the disease.
However, once the other relevant probabilities are taken into account, the true risk turns out to be much lower. Suppose the cancer in question occurs in 1% of the population (the base rate), the test catches about 90% of true cases, and its false positive rate is 9%. Then the true likelihood that you have cancer, given a positive test, is only about 9%.
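Here's a minimal Python version of that calculation. The 1% base rate and 9% false-positive rate come from the example above; the 90% sensitivity (the test's chance of catching a real case) is the assumed detection rate noted in the text.

```python
def posterior(prior, sensitivity, false_positive_rate):
    """Bayes' theorem: probability of having the disease given a positive test."""
    true_positives = sensitivity * prior                 # sick and correctly flagged
    false_positives = false_positive_rate * (1 - prior)  # healthy but flagged anyway
    return true_positives / (true_positives + false_positives)

# 1% base rate and 9% false-positive rate from the example above;
# 90% sensitivity is the assumed detection rate.
print(f"{posterior(prior=0.01, sensitivity=0.90, false_positive_rate=0.09):.1%}")  # ~9.2%
```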
In everyday life, we can use Bayesian reasoning without resorting to plugging in numbers by following three general rules:
- Give more credence to things that are more likely to be true. If a child has blue lips on a summer day just after eating a blue popsicle, it’s more likely that the popsicle stained her lips than that she has a rare disease causing the discoloration.
- Give more credence to things if the evidence is rare and closely associated with a particular event. If that child with blue lips hasn't been eating popsicles but also presents with a rash and fever, the probability that the discoloration is caused by a disease increases.
- Give less credence to things when evidence is common and not closely associated with a particular event. If a child does not have discolored lips or any other symptoms of illness, there’s no rational reason to think she has the rare disease, even if some people with that disease have no symptoms.
Correlation Versus Causation
Pinker next turns his attention to problems that people run into when considering causation and correlation, which drive them to make irrational decisions.
A common mistake people make is thinking that events that are correlated (they often happen at the same time) are causing each other, when in fact they might be linked simply by coincidence or by a third factor. This can lead people to make poor decisions—when they think the wrong event causes another, they incorrectly predict the future.
For example, if the stock price of a company always rises in November, a person might think the arrival of November causes the price to rise, and they might then buy stock in October in anticipation of that rise. However, if the true reason behind the price increase is that the stock rises when the company offers a huge sale on their goods, which they happen to always offer in November, then the person buying stock in October might lose money if the company decides not to offer that sale this particular November. If the person had correctly identified the causal link (between the sale and the price rise, instead of between the month and the price rise), they might have purchased their stock at a better time.
Pinker notes that it can be difficult to determine causation, especially when there are multiple events or characteristics to account for. Complicating matters is that, very often, correlation does imply some sort of causation: If two events are commonly linked, they likely have a common source (as in the stock price example above).
When determining what factors cause other factors and which are merely correlated, Pinker notes that you can do one of two things:
- Run experiments.
- Analyze data.
Experiment to Determine Causation
Pinker recommends running controlled experiments to identify which events cause other events. To do so, you'd divide a sample population into two groups, change some characteristic in one group, and see how (or whether) that change affects outcomes for that group. Such experiments are an excellent way to measure precisely how a factor drives change and to determine which factors are merely correlated with others.
There are limits to these experiments, though. You might fail to account for variables that affect your results (if you’re studying mostly young adults, for example, you might miss how a change would affect a broader population), and there are ethical limits to how much you can change real-world elements. You can’t, for example, force two countries to go to war just so you can examine the effects on food pricing.
Analyze Data to Determine Causation
When you can’t run an experiment, you can look for patterns in existing sets of data that might shed light on how one factor affects another. Pinker mentions two factors in particular that you should analyze data for when determining causation: chronology and nuisance variables.
Chronology: You can often judge which factors affect others, and not in reverse, by noting which factors occurred first. For example, in economic data, if prices across multiple countries rise before wages rise, but wages never rise before prices rise, that indicates that price increases drive wage increases and not the reverse.
Nuisance variables: You can control for factors that are associated with events but don't cause them (nuisance variables) by matching those factors across different contexts and seeing how the rest of the data changes within the matched sets. For example, if you're examining how alcohol consumption affects longevity, you'd want to account for how exercise might skew your results. You could examine two groups of people, one that drinks and one that doesn't, and match individuals from both groups who have similar exercise habits. Any differences in longevity between matched individuals would then not be due to differences in exercise and would be more closely related to alcohol consumption.
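Here's a toy sketch of that matching idea in Python. The records are invented; the point is only to show the comparison being made within each exercise level rather than across the whole sample.

```python
# Minimal sketch of matching on a nuisance variable (exercise) before comparing
# drinkers with non-drinkers on longevity. The records below are made up.
people = [
    {"drinks": True,  "exercises": True,  "lifespan": 80},
    {"drinks": True,  "exercises": False, "lifespan": 74},
    {"drinks": False, "exercises": True,  "lifespan": 83},
    {"drinks": False, "exercises": False, "lifespan": 77},
    {"drinks": True,  "exercises": True,  "lifespan": 79},
    {"drinks": False, "exercises": False, "lifespan": 76},
]

def mean_lifespan(group):
    return sum(p["lifespan"] for p in group) / len(group)

# Compare drinkers and non-drinkers *within* each exercise level, so any gap
# can't be explained by differences in exercise.
for exercises in (True, False):
    drinkers     = [p for p in people if p["drinks"] and p["exercises"] == exercises]
    non_drinkers = [p for p in people if not p["drinks"] and p["exercises"] == exercises]
    gap = mean_lifespan(non_drinkers) - mean_lifespan(drinkers)
    print(f"exercises={exercises}: non-drinkers live {gap:.1f} years longer on average")
```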
Rationality and Game Theory
Thus far, we’ve examined how people make rational decisions at an individual level. We’ll now look at how people make decisions as part of a group. This brings us to game theory, which examines how rationality is affected when the needs of an individual are pitted against the needs of others.
We’ll first look at how game theory shows that sometimes, acting irrationally can be the most rational choice, and we’ll then examine how people can be convinced to make rational choices when they’ll only see benefits from those choices if everyone else chooses rationally as well.
The Zero-Sum Game
Pinker first examines how game theory shows that sometimes, a rational person must make choices that are, on their face, irrational, such as when opposing another person in a competition. This happens in a zero-sum game—a match-up that produces one winner and one loser (so that the “positive” win and “negative” loss add up to a sum of “zero”). In such a contest, unpredictability has an advantage, as it prevents the other person from preparing a response.
This is why, for example, a tennis player will try to serve the ball unpredictably, even if, say, her strongest serve is to the right side of the court. If she acts “rationally” and always serves her strongest serve (to the right), her opponent will predict it and prepare to meet it. Thus, her most rational move is to act randomly and irrationally.
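A toy payoff table (with made-up win percentages, not from the book) shows why predictability costs the server: a receiver who knows the serve is always coming to the right can prepare for it, while a mixed strategy keeps the server's guaranteed win rate higher.

```python
# Toy payoff table for the tennis example: the server's chance of winning the
# point, given where she serves and where the receiver anticipates. The
# percentages are invented for illustration.
win_prob = {
    ("right", "right"): 0.55,  # strong serve, but the receiver is ready for it
    ("right", "left"):  0.80,  # strong serve catches the receiver off guard
    ("left",  "right"): 0.70,  # weaker serve, but the receiver guessed wrong
    ("left",  "left"):  0.45,  # weaker serve and the receiver is ready
}

def server_value(p_serve_right):
    """Server's win rate against a receiver who best-responds to her mix."""
    vs_right = p_serve_right * win_prob[("right", "right")] + (1 - p_serve_right) * win_prob[("left", "right")]
    vs_left  = p_serve_right * win_prob[("right", "left")]  + (1 - p_serve_right) * win_prob[("left", "left")]
    # The receiver anticipates whichever side hurts the server most.
    return min(vs_right, vs_left)

print(f"Always serve right: {server_value(1.0):.0%}")  # predictable -> exploited (55%)
print(f"Mix 60/40:          {server_value(0.6):.0%}")  # harder to exploit (61%)
```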
Volunteer’s Dilemma
A volunteer’s dilemma is another situation in which a person’s best choice might be an irrational one. In this dilemma, one person must do something dangerous to help the group as a whole. If they’re successful, they’ll save everyone (including themselves), but if they fail, everyone will suffer—and they’ll suffer most of all.
For example, let’s say you’re marooned with a group of friends on an island and to get off the island, one of you must swim across shark-infested waters to get help. If you succeed, everyone will be saved, but if you fail, the rescue boats won’t know where the group is—and you’ll be eaten by sharks. The question becomes, who will volunteer for such a task?
A volunteer’s dilemma is similar to a zero-sum game in that the incentives of the individuals are in conflict—no one wants to be the one entering the water. However, the end result of this dilemma is not zero-sum: If the volunteer succeeds, everyone wins. If the volunteer fails, everyone loses.
In such a situation, everyone’s individual rational choice is to let someone else volunteer and put themselves in danger. However, if no one volunteers, everyone loses. Thus, in order to ultimately choose rationally so that everyone has a chance of survival, someone will have to irrationally put themselves in danger.
The Tragedy of the Commons
The tragedy of the commons is a dynamic that applies to situations involving shared resources, where everyone in a group has an individual incentive to take as much of that resource for themselves as possible and contribute as little as possible, which ultimately harms everyone. For example, each fisher in a village will be incentivized to catch as many fish as they can, so that others don’t take them first. Unfortunately, if everyone is fishing aggressively, the stock is soon depleted and then no one has enough.
The same dynamic shows up in any situation where a public good is shared, be it roads, schools, or a military force—everyone benefits from using these things, but each individual benefits more if others pay for them. This dynamic also affects how the world’s environmental crisis plays out, as each individual or country is incentivized to consume energy and resources as they wish, hoping that others will curtail their own use. However, those others have the same incentives to use as much as they want to, too.
Pinker writes that the most effective way to manage this dilemma is to remove the choice from individuals and instead have an outsider regulate people's decisions—specifically, a government or organization that oversees how much each individual can take from the shared resource and establishes rules or contracts that individuals must abide by. When "free riders" are punished for taking too much or contributing too little (for example, through fines for failing to pay taxes), everyone is more likely to refrain from the self-benefiting behavior that drains a public resource, because they can trust that others are refraining too.
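A toy simulation (with invented numbers) of a shared fishery illustrates the dynamic: when every fisher takes as much as possible, the stock collapses, while an enforced quota keeps it sustainable.

```python
def simulate(catch_per_fisher, fishers=10, stock=1000, seasons=10):
    """Toy model of a shared fishery: harvest each season, then 20% regrowth, capped at 1,000."""
    for _ in range(seasons):
        stock = max(stock - catch_per_fisher * fishers, 0)  # everyone takes their catch
        stock = min(int(stock * 1.2), 1000)                 # the stock partially recovers
    return stock

print("Unregulated (each fisher takes 30):", simulate(catch_per_fisher=30))  # collapses to 0
print("Enforced quota of 15 per fisher:   ", simulate(catch_per_fisher=15))  # stays at 1,000
```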
Why Humans Are Irrational
So, Pinker asks, given that most people agree on the importance of rationality and its basic characteristics, why do people so often act irrationally? Why do they hold irrational beliefs, such as beliefs in paranormal phenomena or conspiracy theories?
Pinker notes that social media has allowed people to express irrational beliefs loudly, which makes irrational thought seem like a recent and growing phenomenon; he argues, however, that people have held these kinds of beliefs for millennia. Aside from the fallacies and biases we just covered, Pinker discusses two additional causes of human irrationality in the modern world: motivated reasoning and myside thinking.
Motivated Reasoning
Pinker writes that rationality, by itself, is unmotivated. That is, a rational line of thought doesn’t desire to end in a certain place, but instead follows its logic to wherever its premises and conclusions lead it. However, sometimes a rational line of thought points to an end that the thinker doesn’t desire, like when it’s clear that the fair thing to do in a situation requires the reasoner to do something unpleasant. When this happens, a person might fall back on motivated reasoning—the use of faulty logic to arrive at a desired conclusion.
We see people engaging in motivated reasoning when, for example, they justify purchasing an extravagant car by saying they like its fuel efficiency. In such a case, their true motivation is that they simply want the car, and they find a reason to justify that desire. We also see motivated reasoning when people choose to ignore certain facts that don’t support their worldview—like when a favored politician does something wrong. And, it’s behind many conspiracy theories, such as when someone who doesn’t want to believe in climate change dismisses scientific data as manipulated despite a lack of evidence to that effect.
Pinker says that people engage in motivated reasoning so frequently that it suggests our instinct to win arguments evolved in tandem with our ability to reason. We've evolved not only to think logically but also to convince others of our logic, even when it's flawed. According to this theory, the evolutionary advantage of the instinct is that it leads to stronger collective conclusions: people are eager to pass off weak arguments of their own but quick to point out flaws in the arguments of others, and in the process the group as a whole arrives at the right answer. He points to studies showing that small groups are better at reaching a correct answer than individuals are; as long as one group member can spot the right argument, the others are quickly convinced.
Myside Thinking
Myside thinking is the tendency to irrationally favor information or conclusions that support your group. It's largely driven by our desire to be part of a collective, and in this way it's rational: If your goal is to be respected and valued by your peers, it makes sense to express opinions, and even see things, in a way that earns this respect. But the result is that you'll likely think and behave irrationally, overlooking the logical flaws of arguments that support your own side while fixating on the flaws of the other side.
Pinker writes that virtually everyone is susceptible to myside thinking—no matter their political affiliation, race, class, gender, education level, or awareness of cognitive biases and fallacies—and that myside thinking is driving the heated political climate of recent years. He points to studies that show liberals and conservatives will accept or refute a conclusion or scientific evidence based on whether or not it supports their predetermined notions (not based on whether or not it’s well-argued or supported). Additionally, if an invalid logical statement supports a liberal idea, a conservative will be more likely to spot the fallacy, and vice versa.