This article is an excerpt from the Shortform book guide to "Superforecasting" by Philip E. Tetlock.
Is Philip Tetlock’s Superforecasting worth reading? What inspired Tetlock to write the book?
The idea for Superforecasting grew out of Tetlock’s Expert Political Judgment and Good Judgment Project forecasting experiments, during which Tetlock and his team discovered that some people are significantly better at predicting the future than others. Tetlock and co-author Dan Gardner set out to explore exactly what sets these “superforecasters” apart from everyone else; Superforecasting is the result.
This Superforecasting review looks at the book’s intellectual context, background, and impact, and discusses its key strengths and weaknesses.
Superforecasting: The Art and Science of Prediction
Superforecasting is the result of decades of research on “superforecasters”: people who can predict future events with accuracy significantly better than chance. Superforecasters are intelligent, but more importantly, they’re open-minded, deeply curious, and adept at sidestepping their own cognitive biases. Not everyone is cut out to be a superforecaster, but by studying the way superforecasters make predictions, anyone can improve their ability to predict the future.
Below is a quick Superforecasting review. We’ll cover the book’s context, background, and critical reception.
About the Authors
Philip Tetlock is a Canadian-American author and researcher focusing on good judgment, political psychology, and behavioral economics. He currently teaches at the University of Pennsylvania in the Wharton School of Business as well as in the departments of psychology and political science.
In 2011, Tetlock and his spouse, psychologist Barbara Mellers, co-founded the Good Judgment Project, a research project involving more than twenty thousand amateur forecasters. The project was designed to discover just how accurate the best human forecasters can be and what makes some people better forecasters than others. It led Tetlock to discover a group of “superforecasters” who could outperform professional intelligence analysts; these findings eventually inspired Superforecasting.
Today, the Good Judgment Project hosts training sessions on general forecasting techniques for both individuals and organizations and produces custom forecasts.
Dan Gardner is a Canadian journalist and the co-author of Superforecasting. Gardner has written two other books on the psychology of decision-making and prediction: Risk: The Science and Politics of Fear (2008), which delves into the science of how we make decisions about risky situations, and Future Babble (2011), which explores Tetlock’s earlier research on experts’ inability to accurately predict the future. He is a senior fellow at the University of Ottawa’s Graduate School of Public Policy and International Affairs.
The Book’s Publication
Publisher: Crown Publishing Group, a subsidiary of Penguin Random House
Superforecasting was published in 2015. It is Tetlock’s fourth book (and Gardner’s third) and the best-known title in either author’s bibliography.
Superforecasting builds on Tetlock’s previous book, Expert Political Judgment, in which he first described the results of his earlier, decades-long study of expert predictions and answered the question, “Why are political experts so bad at making predictions?” In Superforecasting, Tetlock turns his attention away from experts’ failures and toward the successes of a few ordinary people who can predict global events more accurately than chance.
The Book’s Intellectual Context
Superforecasting fits into the wave of research on predictability, uncertainty, and cognitive biases that emerged in the 2000s and 2010s. The book directly references Daniel Kahneman’s Thinking, Fast and Slow; Kahneman’s pioneering research on heuristics and cognitive biases paved the way for Tetlock’s study of the limits of forecasting. Tetlock and Gardner also directly reference Nassim Nicholas Taleb, author of several books on uncertainty, including The Black Swan, Antifragile, and Fooled by Randomness. Taleb is deeply critical of forecasting as an enterprise, and the authors devote an entire section of Superforecasting to addressing his critiques.
At its core, Superforecasting is an exploration of how superforecasters think, with a particular emphasis on the way they avoid knee-jerk assumptions and cognitive biases. This presents a helpful counterpart to books like Dan Ariely’s Predictably Irrational and Malcolm Gladwell’s Blink, both of which explore the risks and benefits of subconscious thinking in more depth.
The Book’s Impact
The intellectual impact of Superforecasting was strongest within the fields of forecasting and behavioral science. For example, poker champion and author Annie Duke references Superforecasting in her 2018 book, Thinking in Bets, in which she builds on Tetlock’s research on avoiding cognitive biases and creating teams of like-minded people to help make decisions.
Superforecasting also played a small role in a British government scandal. Dominic Cummings, former aide to UK Prime Minister Boris Johnson, told reporters to read the book instead of listening to “political pundits who don’t know what they’re talking about.” He even wrote his own glowing review of the book on his personal blog. Cummings also hired Andrew Sabisky, a superforecaster, as an advisor. However, Sabisky resigned after only a few days on the job after old blog posts surfaced in which he claimed that Black people have lower IQs than white people and that the government should adopt eugenics policies to prevent the growth of the “lower class.” For many people, this scandal was their first encounter with the concept of “superforecasting.”
The Book’s Strengths and Weaknesses
Superforecasting was generally well-received by critics and readers. People in a wide variety of fields—including real estate, ecology, management, and actuarial science—gave Superforecasting positive reviews and recommended it to anyone in their respective industries who wants to improve their decision-making skills. A New York Times reviewer praised the book for providing practical advice on how to make better forecasts rather than just describing the forecasting process. Some reviewers even tried their hand at making their own forecasts after reading the book.
Critical reviews of Superforecasting focus not so much on the book itself as on the utility of forecasting as a practice. One reviewer argued that, while superforecasters may be able to make accurate predictions about very specific events in the near future, the types of events they predict are not the ones we should be most worried about. Instead, we should focus on the events that are likely to have the biggest impact on society—which are likely to be so rare that they are completely unpredictable. These events are what author Nassim Nicholas Taleb calls “black swans,” which we’ll explore in depth in this guide.
Commentary on the Book’s Approach
While Superforecasting has two official authors, the book is written in Tetlock’s voice as he describes his personal experience conducting decades of research on forecasting tournaments. Tetlock and Gardner accurately represent the criticisms other authors (such as Taleb) have levied against formal forecasting and make a compelling case for the utility of forecasting as an enterprise, especially on a national and global scale. The authors also describe the techniques that superforecasters use in enough detail that readers come away well-equipped to try making their own predictions. (If you’d like to test your forecasting skills after reading, you can take on one of the challenges in the Good Judgment Open, an online, ongoing forecasting tournament.)
Commentary on the Book’s Organization
Superforecasting begins with an overview of Tetlock’s approach to forecasting, including both his optimism about the strengths of “superforecasters” and his recognition that, while superforecasters make more accurate predictions than other forecasters, there is still a hard limit to how far any human can see into the future. The authors then discuss the difficulties of measuring a forecaster’s accuracy, how the lack of a solid measurement system has significantly hindered forecasting research, and how Tetlock’s “Expert Political Judgment” and “Good Judgment Project” forecasting tournaments changed the landscape of the field. The bulk of Superforecasting describes the skills and traits of the superforecasters themselves, with the goal of advancing the authors’ thesis: the skills that make superforecasters so “super” aren’t inborn; they can be developed with practice.
The book’s organization serves the authors’ goal of introducing the reader to the world of formal forecasting, holding up superforecasters as a model of the full potential of human forecasting, and arguing for the importance of forecasting tournaments. The authors seem to understand that the kind of formal, geopolitical forecasting that goes on in forecasting tournaments is completely foreign to many readers, so they begin with very general concepts (like the idea that, to a certain degree, the future is knowable) before getting into more specific details of what makes superforecasters so good at what they do.
One downside to the authors’ decision to move from broad concepts to specific details is that certain core ideas are separated into multiple parts of the book, which can be confusing for the reader. For example, the sections on calculating Brier scores and measuring regression to the mean—two ways of measuring forecasters’ performance—are located in two different chapters.
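The Brier score itself is straightforward to compute, which may help clarify what those chapters are measuring. Below is a minimal Python sketch of the two-category version the book describes, in which 0.0 is a perfect score and 2.0 is maximally wrong; the sample forecasts are invented purely for illustration.

```python
def brier_score(forecast: float, outcome: int) -> float:
    """Two-category Brier score for a yes/no question: 0.0 is perfect,
    2.0 is maximally wrong, and an always-hedging 0.5 forecast scores
    0.5 no matter what happens.

    forecast -- predicted probability that the event occurs (0.0 to 1.0)
    outcome  -- 1 if the event occurred, 0 if it did not
    """
    # Squared error on the "event occurs" side plus squared error
    # on the "event does not occur" side.
    return (forecast - outcome) ** 2 + ((1 - forecast) - (1 - outcome)) ** 2


# Hypothetical track record: (predicted probability, actual outcome).
forecasts = [(0.9, 1), (0.7, 1), (0.2, 0)]

# A forecaster's overall accuracy is the mean score across many questions.
mean_score = sum(brier_score(p, o) for p, o in forecasts) / len(forecasts)
print(f"Mean Brier score: {mean_score:.3f}")  # lower is better; prints 0.093
```

Regression to the mean, the second measure mentioned above, would then show up in data like this as unusually good (or bad) average scores drifting back toward the group average in later rounds.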
———End of Preview———
Like what you just read? Read the rest of the world's best book summary and analysis of Philip E. Tetlock's "Superforecasting" at Shortform.
Here's what you'll find in our full Superforecasting summary:
- How to make predictions with greater accuracy
- The 7 traits of superforecasters
- How Black Swan events can challenge even the best forecasters