This article is an excerpt from the Shortform summary of "The Black Swan" by Nassim Taleb. Shortform has the world's best summaries of books you should be reading.
Like this article? Sign up for a free trial here.
What is a black swan? What is black swan theory? What does the term “black swan” mean?
A black swan is an extremely unpredictable event that has a massive impact on human society. The Black Swan is named after a classic error of induction, the black swan fallacy.
We’ll cover the meaning of “black swan,” what black swan theory is, and what “black swan” means in economics.
What Is a Black Swan?
The Black Swan is the second book in former options trader Nassim Nicholas Taleb’s five-volume series on uncertainty. What does “black swan” mean? This book analyzes so-called “Black Swans”—extremely unpredictable events that have massive impacts on human society.
The Black Swan is named after a classic error of induction wherein an observer assumes that because all the swans he’s seen are white, all swans must be white. Black Swans have three salient features:
- They are rare (statistical outliers);
- They are disproportionately impactful; and, because of that outsize impact,
- They compel human beings to explain why they happened—to show, after the fact, that they were indeed predictable.
Taleb’s black swan theory, however, is that Black Swans, by their very nature, are always unpredictable—they are the “unknown unknowns” for which even our most comprehensive models can’t account. What is a black swan? The fall of the Berlin Wall, the 1987 stock market crash, the creation of the Internet, 9/11, the 2008 financial crisis—all are Black Swans.
Once Taleb introduces the concept of the Black Swan, he delves into human society and psychology, analyzing why modern civilization invites wild randomness and why humans can neither accept nor control that randomness.
Black Swan Theory: Extremistan vs. Mediocristan
To explain how and why Black Swans occur and what “black swan” means, Taleb coins two categories to describe the measurable facets of existence: Extremistan and Mediocristan.
In Mediocristan, randomness is highly constrained, and deviations from the average are minor. Physical characteristics such as height and weight are from Mediocristan: They have upper and lower bounds, their distribution is a bell curve, and even the tallest or heaviest human being isn't wildly taller or heavier than the average. In Mediocristan, prediction is possible. In black swan theory, Black Swans don't live in Mediocristan.
In Extremistan, however, randomness is wild, and deviations from the average can be, well, extreme. Most social, man-made aspects of human society—the economy, the stock market, politics—hail from Extremistan: They have no known upper or lower bounds, their behavior can’t be graphed on a bell curve, and individual events or phenomena—i.e., Black Swans—can have exponential impacts on averages.
What is a black swan? Imagine you put ten people in a room. Even if one of those people is Shaquille O’Neal, the average height in the room is likely to be pretty close to the human average (Mediocristan). If one of those people is Jeff Bezos, however, suddenly the wealth average changes drastically (Extremistan).
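A quick numerical sketch makes the contrast concrete. The figures below are invented for illustration (they are not from the book); the point is only that a single extreme observation barely moves a Mediocristan average but completely dominates an Extremistan one:

```python
# Illustrative only: rough heights (cm) and net worths (USD) for ten people.
heights = [175, 168, 180, 172, 165, 178, 170, 174, 169, 216]   # last entry: a 7-foot basketball player
wealth  = [60_000, 45_000, 80_000, 55_000, 70_000, 52_000,
           65_000, 48_000, 75_000, 100_000_000_000]             # last entry: a centibillionaire

def mean(xs):
    return sum(xs) / len(xs)

# Mediocristan: the tallest person barely moves the average height.
print(mean(heights[:-1]), mean(heights))   # ~172 cm -> ~177 cm

# Extremistan: one extreme fortune completely dominates the average wealth.
print(mean(wealth[:-1]), mean(wealth))     # ~61,000 USD -> ~10,000,000,000 USD
```

Swap the last height for any human height you like and the average hardly budges; swap the last fortune for a plausible extreme one and the average shifts by orders of magnitude.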
Black Swan Theory and the Unreliability of “Experts”
Taleb has very little patience for “experts”—academics, thought leaders, corporate executives, politicians, and the like. Throughout the book, Taleb illustrates how and why “experts” are almost always wrong and have little more ability to predict the future than the average person.
According to black swan theory, there are two reasons “experts” make bad predictions:
1) Human Nature
Because of various habits innate to our species—our penchant for telling stories, our belief in cause and effect, our tendency to “cluster” around specific ideas (confirmation bias) and “tunnel” into specific disciplines or methods (specialization)—we tend to miss or minimize randomness’s effect on our lives. Experts are no less guilty of this blind spot than the average person.
2) Flawed Methods
Because experts both (1) “tunnel” into the norms of their particular discipline and (2) base their predictive models exclusively on past events, their predictions are inevitably susceptible to the extremely random and unforeseen.
Consider, for example, a financial analyst predicting the price of a barrel of oil in ten years. This analyst may build a model using the gold standards of her field: past and current oil prices, car manufacturers’ projections, projected oil-field yields, and a host of other factors, computed using the techniques of regression analysis. The problem is that this model is innately narrow. It can’t account for the truly random—a natural disaster that disrupts a key producer, or a war that increases demand exponentially. This is what black swan theory attempts to explain.
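To see why such a model is narrow, here is a minimal sketch of the kind of trend extrapolation an analyst might rely on. The prices are made up and the method is deliberately simple (an ordinary least-squares line fitted by hand), but the blind spot is the same in more elaborate regressions: nothing in the fitted inputs represents the shock that ends up mattering most.

```python
# Hypothetical annual oil prices (USD per barrel) for ten past years -- invented for illustration.
years = list(range(10))
prices = [52, 55, 58, 61, 59, 63, 66, 70, 72, 75]

# Fit a straight trend line by ordinary least squares.
n = len(years)
mean_x = sum(years) / n
mean_y = sum(prices) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(years, prices))
         / sum((x - mean_x) ** 2 for x in years))
intercept = mean_y - slope * mean_x

# The model dutifully extrapolates the trend ten years ahead...
print(f"Forecast for year 20: ${intercept + slope * 20:.0f} per barrel")

# ...but it contains no term for a war, an embargo, or a disrupted producer.
# Whatever drives the real year-20 price most is, by construction, not in the model.
```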
What is black swan theory? Taleb draws a key distinction between experts in Extremistan disciplines (economics, finance, politics, history) and Mediocristan disciplines (medicine, physical sciences). Experts like biologists and astrophysicists are able to predict events with fair accuracy; experts like economists and financial planners are not.
Difficulties of Prediction
According to black swan theory, the central problem with experts is their uncritical belief in the possibility of prediction, despite the mountain of evidence that indicates prediction is a fool’s errand. Some key illustrations of the futility of prediction include:
Discoveries
Most groundbreaking discoveries occur by happenstance—luck—rather than careful and painstaking work. The quintessential example is the discovery of penicillin. Discoverer Alexander Fleming wasn’t researching antibiotics; rather, he was studying the properties of a particular bacterium. He left a stack of cultures lying out in his laboratory while he went on vacation, and when he returned he found that a bacteria-killing mold had formed on one of the cultures. Voilà—the world’s first antibiotic.
Dynamical Systems
A dynamical system is one in which many inputs affect one another. Whereas prediction in a system with, say, two inputs is a simple affair—one need only account for the qualities and behavior of those two inputs—prediction in a system with, say, five hundred billion inputs is effectively impossible.
The most famous illustration of a dynamical system’s properties is the “butterfly effect.” The idea was proposed by MIT meteorologist Edward Lorenz, who discovered that an infinitesimal change in input parameters can drastically change the output of a weather model. The “butterfly effect” describes the possibility that the flutter of a butterfly’s wings can, a few weeks later and many miles distant, cause a tornado.
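The standard demonstration of this sensitivity is Lorenz’s own toy weather model, the three-variable Lorenz system. The sketch below (a simple Euler integration with the textbook parameters, written for illustration rather than numerical accuracy) starts two simulations whose initial conditions differ by one part in a hundred million; within a few thousand steps their trajectories bear no resemblance to each other:

```python
# Two runs of the Lorenz system whose starting points differ by 1e-8.
# Textbook parameters; simple Euler stepping, for illustration only.
SIGMA, RHO, BETA = 10.0, 28.0, 8.0 / 3.0

def step(x, y, z, dt=0.01):
    dx = SIGMA * (y - x)
    dy = x * (RHO - z) - y
    dz = x * y - BETA * z
    return x + dx * dt, y + dy * dt, z + dz * dt

a = (1.0, 1.0, 1.0)           # baseline initial condition
b = (1.0 + 1e-8, 1.0, 1.0)    # perturbed by one part in a hundred million

for i in range(3001):
    if i % 1000 == 0:
        print(i, round(a[0], 3), round(b[0], 3))  # x-coordinates of both runs
    a = step(*a)
    b = step(*b)
```

The difference between the two runs starts out far below anything a real instrument could measure, which is exactly why weather beyond a couple of weeks cannot be forecast.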
Predicting the Past
The past itself is as unknowable as the future, even where major Black Swan events are concerned. According to black swan theory, because the world is so complex and a single event can be influenced by any number of tiny causes, we cannot reverse-engineer the causes of events.
An example should help illustrate. Think of an ice cube sitting on a table. Imagine the shape of the puddle that ice cube will make as it melts.
Now think of a puddle on the table and try to imagine how that puddle got there.
When historians propose causes for certain historical events, they’re looking at puddles and imagining ice cubes (or a spilled glass of water, or some other cause). The problem is that the sheer number of possible causes for a puddle—or a historical event—renders any ascription of cause suspect.
If You Can’t Predict, How Do You Deal with Uncertainty?
Although Taleb is far more concerned with explaining why prediction is impossible than he is with proposing alternatives or solutions, he does offer some strategies for dealing with radical uncertainty in his black swan theory.
1) Don’t Sweat the Small Predictions
When it comes to low-stakes, everyday predictions—about the weather, say, or the outcome of a baseball game—there’s no harm in indulging our natural penchant for prediction: If we’re wrong, the repercussions are minimal. It’s when we make large-scale predictions and incur real risk on their basis that we get into trouble.
2) Maximize Possibilities for Positive Black Swans
Although the most memorable Black Swans are typically the negatively disruptive ones, Black Swans can also be serendipitous. (Shortform note: Love at first sight is an example of a serendipitous Black Swan.)
Black swan theory provides two strategies for opening ourselves up to positive Black Swans: (1) sociability and (2) proactiveness when presented with an opportunity. Sociability puts us in the company of others who may be in a position to help us—we never know where a casual conversation might lead. And proactiveness—for example, taking a successful acquaintance up on an invitation to have coffee—ensures we’ll never miss our lucky break.
3) Adopt the “Barbell Strategy”
The barbell strategy is an important part of Taleb’s black swan theory. When Taleb was a trader, he pursued an idiosyncratic investment strategy to inoculate himself against a financial Black Swan. He devoted 85%–90% of his portfolio to extremely safe instruments (Treasury bills, for example) and made extremely risky bets—in venture-capital portfolios, for example—with the remaining 10%–15%. (Another variation on the strategy is to have a highly speculative portfolio but to insure yourself against losses greater than 15%.) The high-risk portion of Taleb’s portfolio was highly diversified: He wanted to place as many small bets as possible to increase the odds of a Black Swan paying off in his favor.
The “barbell strategy” is designed to minimize the pain of a negative Black Swan while, potentially, reaping a positive Black Swan’s benefits. If the market collapses, a person pursuing this strategy isn’t hurt beneath the “floor” of the safe investments (say, 85%), but if the market explodes, he has a chance to capitalize by virtue of the speculative bets.
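A rough simulation shows the shape of the payoff. All of the numbers below (the return on the safe portion, the bet sizes, the payoff multiple, the hit probability) are invented for illustration and are not Taleb’s; the structural point is that losses are capped near the safe “floor” while gains are open-ended:

```python
import random

random.seed(0)

def barbell_year(capital=100_000, safe_frac=0.85, safe_return=0.03,
                 n_bets=50, payoff_multiple=40, hit_prob=0.02):
    """One simulated year of a barbell portfolio (illustrative assumptions).

    The safe portion earns a small fixed return; the risky portion is split
    into many small bets, each of which usually goes to zero but occasionally
    pays off at 40x.
    """
    safe = capital * safe_frac * (1 + safe_return)
    bet_size = capital * (1 - safe_frac) / n_bets
    risky = sum(bet_size * payoff_multiple
                for _ in range(n_bets) if random.random() < hit_prob)
    return safe + risky

outcomes = sorted(barbell_year() for _ in range(10_000))
print("worst year :", round(outcomes[0]))      # never far below the safe floor
print("median year:", round(outcomes[5_000]))
print("best year  :", round(outcomes[-1]))     # a lucky cluster of positive Black Swans
```

Whatever the risky bets do, the portfolio can never fall below roughly the value of the safe portion, so the maximum loss is known in advance; the many small bets exist only to leave room for an outsized win.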
4) Distinguish Between Positive Contingencies and Negative Ones
Different areas of society have different exposure to Black Swans, both positive and negative. For example, scientific research and moviemaking are “positive Black Swan areas”—catastrophes are rare, and there is always the possibility of a smashing success. The stock market and catastrophe insurance, meanwhile, are “negative Black Swan areas”—upsides are relatively modest compared to the possibility of financial ruin.
Suffice it to say, we should take more risks in a positive Black Swan area than in a negative Black Swan one.
5) Prepare, Don’t Predict
Because Black Swans are, by definition, unpredictable, we’re better off preparing for the widest range of contingencies than predicting specific events.
That’s because, according to black swan theory, though Black Swans themselves can never be predicted, their effects can be. For example, no one can predict when an earthquake will strike, but one can know what its effects will be and prepare adequately to handle them.
The same goes for an economic recession. No one can predict precisely when one will occur, but, using the “barbell strategy” or some other means of mitigating risk, we can at least be prepared for one.
———End of Preview———
Like what you just read? Read the rest of the world's best summary of "Black Swan" at Shortform. Learn the book's critical concepts in 20 minutes or less.
Here's what you'll find in our full Black Swan summary :
- Why world-changing events are unpredictable, and how to deal with them
- Why you can't trust experts, especially the confident ones
- The best investment strategy to take advantage of Black Swans