This article is an excerpt from the Shortform book guide to "Superforecasting" by Philip E. Tetlock. Shortform has the world's best summaries and analyses of books you should be reading.
What is attribute substitution? Why does it happen and how does it skew our judgment?
Attribute substitution is the psychological process underlying a number of cognitive heuristics and perceptual illusions. It occurs when people face a computationally complex judgment and, because of that complexity, unconsciously answer in terms of a more easily calculated attribute instead.
Read about attribute substitution and what it means.
What Is Attribute Substitution?
When it comes to making judgments, we are not as reasonable as we’d like to think we are. This is especially the case with problems that are computationally complex. When we are faced with such a problem, we have a strong tendency to rely on fast, intuitive processing rather than on deliberate reasoning. Although intuition can sometimes be useful, more often than not it skews our judgment. The key to this bias is a process of attribute substitution: unconsciously replacing the question before us with an easier one that we can answer.
Example: Political Bait and Switch
We expect other people not only to think like us but also to think logically. For example, in 2013, a superforecaster named Bill Flack was asked whether the Japanese prime minister would visit the Yasukuni Shrine, a controversial war memorial that includes roughly one thousand war criminals among its honorees. Previous prime ministers had been warned against visiting this particular shrine because such visits have historically provoked strong backlash from the neighboring governments of China and South Korea.
Based on the available information, Flack predicted the current prime minister wouldn’t be willing to take that risk and would absolutely not visit Yasukuni. Shortly after this prediction, new information from a source close to the prime minister indicated he was planning to visit the shrine. Flack stood his ground and didn’t update his forecast—visiting the shrine simply made no sense, so the new information must be wrong.
As you may have guessed, Flack was wrong. The prime minister visited Yasukuni later that year. Why? He was a conservative nationalist for whom the political risks were worth the personal gain of paying his respects. Bill Flack’s mistake was to replace the original question with an easier one, such as “Would I visit the shrine if I were the prime minister?” or “What would be the rational thing to do in this situation?”
———End of Preview———
Like what you just read? Read the rest of the world's best book summary and analysis of Philip E. Tetlock's "Superforecasting" at Shortform.
Here's what you'll find in our full Superforecasting summary :
- How to make predictions with greater accuracy
- The 7 traits of superforecasters
- How Black Swan events can challenge even the best forecasters