PDF Summary: The Black Swan, by Nassim Nicholas Taleb
Book Summary: Learn the key points in minutes.
Below is a preview of the Shortform book summary of The Black Swan by Nassim Nicholas Taleb. Read the full comprehensive summary at Shortform.
1-Page PDF Summary of The Black Swan
Black Swans are extremely unpredictable events that have massive impacts on human society. These include positive Black Swans, like the invention of the Internet and the discovery of antibiotics, as well as negative Black Swans, like the 2008 recession.
Black Swans compel people to explain why they happened—to show, after the fact, that they were indeed predictable. Taleb’s thesis, however, is that Black Swans, by their very nature, are always unpredictable. Furthermore, because of our cognitive biases, we’re more vulnerable than ever to misunderstanding Black Swans and their impact.
Learn what Black Swans are, why we’re susceptible to them, and how best to prepare yourself for the unpredictable.
(continued)...
Difficulties of Prediction
The central problem with experts is their uncritical belief in the possibility of prediction, despite the mountain of evidence that indicates prediction is a fool’s errand. Some key illustrations of the futility of prediction include:
Discoveries
Most groundbreaking discoveries occur by happenstance—luck—rather than careful and painstaking work. The quintessential example is the discovery of penicillin. Discoverer Alexander Fleming wasn’t researching antibiotics; rather, he was studying the properties of a particular bacterium. He left a stack of cultures lying out in his laboratory while he went on vacation, and when he returned he found that a bacteria-killing mold had formed on one of the cultures. Voilà—the world’s first antibiotic.
Dynamical Systems
A dynamical system is one in which many inputs affect one another. Prediction in a system with, say, two inputs is a simple affair—one need only account for the qualities and behavior of those two inputs—but prediction in a system with, say, five hundred billion interacting inputs is effectively impossible.
The most famous illustration of a dynamical system’s properties is the “butterfly effect.” This idea was proposed by the MIT meteorologist Edward Lorenz, who discovered that an infinitesimal change in input parameters can drastically change weather models. The “butterfly effect” describes the possibility that the flutter of a butterfly’s wings can, a few weeks later and many miles distant, cause a tornado.
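This sensitivity is easy to see numerically. The toy sketch below (ours, not the book’s) uses the logistic map—a standard one-line stand-in for a chaotic dynamical system—to show that two starting points differing by one part in a billion soon end up nowhere near each other:

```python
def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the logistic map x -> r * x * (1 - x) from x0
    and return the full trajectory (r = 4 is the chaotic regime)."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

# Two starting points that differ by one part in a billion...
a = logistic_trajectory(0.400000000)
b = logistic_trajectory(0.400000001)

# ...diverge completely within a few dozen steps.
print(max(abs(x - y) for x, y in zip(a, b)))
```

The divergence compounds at every step, which is why weather forecasts degrade so quickly: any measurement error, however tiny, eventually dominates the model.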
Predicting the Past
The past itself is as unknowable as the future. Because the world is so complex, and because a single event can be influenced by any number of tiny causes, we cannot reverse-engineer the causes of events.
An example should help illustrate. Think of an ice cube sitting on a table. Imagine the shape of the puddle that ice cube will make as it melts.
Now think of a puddle on the table and try to imagine how that puddle got there.
When historians propose causes for certain historical events, they’re looking at puddles and imagining ice cubes (or a spilled glass of water, or some other cause). The problem is that the sheer number of possible causes for a puddle—or a historical event—render any ascription of cause suspect.
If You Can’t Predict, How Do You Deal with Uncertainty?
Although Taleb is far more concerned with explaining why prediction is impossible than he is with proposing alternatives or solutions, he does offer some strategies for dealing with radical uncertainty.
1) Don’t Sweat the Small Predictions
When it comes to low-stakes, everyday predictions—about the weather, say, or the outcome of a baseball game—there’s no harm in indulging our natural penchant for prediction: If we’re wrong, the repercussions are minimal. It’s when we make large-scale predictions and incur real risk on their basis that we get into trouble.
2) Maximize Possibilities for Positive Black Swans
Although the most memorable Black Swans are typically the negatively disruptive ones, Black Swans can also be serendipitous. (Shortform note: Love at first sight is an example of a serendipitous Black Swan.)
Two strategies for opening ourselves up to positive Black Swans are (1) sociability and (2) proactiveness when presented with an opportunity. Sociability puts us in the company of others who may be in a position to help us—we never know where a casual conversation might lead. And proactiveness—for example, taking up a successful acquaintance on an invitation to have coffee—ensures we’ll never miss our lucky break.
3) Adopt the “Barbell Strategy”
When Taleb was a trader, he pursued an idiosyncratic investment strategy to inoculate himself against a financial Black Swan. He devoted 85%–90% of his portfolio to extremely safe instruments (Treasury bills, for example) and made extremely risky bets—in venture-capital portfolios, for example—with the remaining 10%–15%. (Another variation on the strategy is to have a highly speculative portfolio but to insure yourself against losses greater than 15%.) The high-risk portion of Taleb’s portfolio was highly diversified: He wanted to place as many small bets as possible to increase the odds of a Black Swan paying off in his favor.
The “barbell strategy” is designed to minimize the pain of a negative Black Swan while, potentially, reaping a positive Black Swan’s benefits. If the market collapses, a person pursuing this strategy isn’t hurt beneath the “floor” of the safe investments (say, 85%), but if the market explodes, he has a chance to capitalize by virtue of the speculative bets.
4) Distinguish Between Positive Contingencies and Negative Ones
Different areas of society have different exposure to Black Swans, both positive and negative. For example, scientific research and moviemaking are “positive Black Swan areas”—catastrophes are rare, and there is always the possibility of smashing success. The stock market or catastrophe insurance, meanwhile, are “negative Black Swan areas”—upsides are relatively modest compared to the possibility of financial ruin.
Suffice it to say, we should take more risks in a positive Black Swan area than in a negative Black Swan one.
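A hypothetical payoff table (our numbers, not Taleb’s) makes the asymmetry concrete: in a positive Black Swan area the worst single outcome is capped and the best is open-ended, while a negative Black Swan area inverts that shape.

```python
# Positive exposure: back 100 small films at $1M each. 99 flop,
# one returns $200M. Each individual loss is capped at the stake.
films = [-1_000_000] * 99 + [199_000_000]

# Negative exposure: sell 100 years of catastrophe cover at a $1M
# premium. 99 quiet years, then one $200M claim.
insurance = [1_000_000] * 99 + [1_000_000 - 200_000_000]

print(sum(films))      # net +$100M; worst single outcome: -$1M
print(sum(insurance))  # net -$100M; worst single outcome: -$199M
```

The averages are mirror images, but the shapes are not: the film backer’s ruin is bounded, the insurer’s is not—which is why risk-taking makes sense in the first area and not the second.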
5) Prepare, Don’t Predict
Because Black Swans are, by definition, unpredictable, we’re better off preparing for the widest range of contingencies than predicting specific events.
That’s because, though Black Swans themselves can never be predicted, their effects can be. For example, no one can predict when an earthquake will strike, but one can know what its effects will be and prepare adequately to handle them.
The same goes for an economic recession. No one can predict precisely when one will occur, but, using the “barbell strategy” or some other means of mitigating risk, we can at least be prepared for one.
Here's a preview of the rest of Shortform's The Black Swan PDF summary:
PDF Summary Shortform Introduction
...
Taleb’s knowledge is encyclopedic. Over the course of the text’s four parts, he discusses mathematics, economics, philosophy, statistics, psychology, political science, history, physics, and literature, among other topics, all with an eye to exposing our assumptions about the nature of randomness and our abilities to account for it.
This summary covers all of Taleb’s major claims as well as many of his real-world examples, but it doesn’t attempt to recreate his energetic prose or his sense of humor. For those readers who want to experience Taleb’s unique writing voice and personality, we urge you to turn to the book itself.
PDF Summary Chapter 1: What Is a Black Swan?
...
Platonicity and the Platonic Fold
Like Plato, with his abstract and ideal “forms,” human beings in general tend to favor neat, “pure” concepts that are universally consistent. These concepts—mathematical rules, notions of historical progress, economic laws—allow us to form models of the world so that predictions are much easier to make.
The problem with these models is that they lead us to “mistake the map for the territory”—that is, we are fooled into thinking the models are reality, rather than a very particular representation of reality that excludes outliers (i.e., Black Swans).
Taleb calls this natural human tendency to box in reality Platonicity, and he holds it responsible for our dangerous confidence in our own knowledge. We become so enamored of our elegant, self-consistent models that we are unable to see beyond them.
It’s where our models cease to be useful that Black Swans occur—in the Platonic fold between our predictive models and unpredictable reality.
The Origins of Taleb’s Black Swan Obsession
Taleb’s first encounter with a Black Swan took place in his home country of Lebanon.
For centuries, the region around Mount Lebanon was...
PDF Summary Chapter 2: Scalability | Mediocristan and Extremistan
...
The Contrary Worlds of Mediocristan and Extremistan
“Mediocristan” is Taleb’s term for the facets of our experience that are nonscalable. Human physical traits such as height and weight, like the income of a massage therapist, hail from Mediocristan—they have upper and lower bounds, and if you were to graph every human’s height and weight, you would produce a bell curve.
Mediocristan’s overriding law can be stated thus: Given a large-enough sample size, no individual event will have a significant effect on the total. That is, there will be outliers—extremely heavy or tall people—but those outliers (1) will not be exponentially larger or smaller than the average, and (2) will be rendered insignificant by the sheer number of average cases. Most physical phenomena—human footspeed, trees’ rate of growth—come from Mediocristan. (Shortform note: Taleb sometimes treats Mediocristan as a distinct place, other times as an adjective to describe certain kinds of phenomena.)
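This law is easy to check numerically. The sketch below (our illustration, with made-up parameters) draws 100,000 “heights” from a bell curve and, for contrast, 100,000 Extremistan-style “fortunes” from a heavy-tailed Pareto distribution:

```python
import random

random.seed(0)

# Mediocristan: heights (cm) from a bell curve.
heights = [random.gauss(170, 10) for _ in range(100_000)]
print(f"tallest person's share of total height: {max(heights) / sum(heights):.6%}")

# Extremistan: fortunes from a heavy-tailed Pareto distribution.
fortunes = [random.paretovariate(1.1) for _ in range(100_000)]
print(f"largest fortune's share of total wealth: {max(fortunes) / sum(fortunes):.2%}")
```

The tallest person contributes a vanishing fraction of the total, while the single largest fortune typically accounts for a meaningful slice of all the wealth—exactly the scalable inequality that defines Extremistan.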
“Extremistan,” oppositely, describes those facets of our experience that are eminently scalable. In Extremistan, inequalities are vast enough that one instance can profoundly affect the...
PDF Summary Chapter 3: Don’t Be a Turkey | It Pays to Be a Skeptic
...

| Traits of the Empirical (a-Platonic) Skeptic | Traits of the Platonifier |
| --- | --- |
| Respects those who say “I don’t know” | Views those who say “I don’t know” as ignorant |
| Thinks of Black Swans as a primary incidence of randomness | Thinks of minor deviations as the primary incidence of randomness |
| Minimizes theory | Praises theory |
| Assumes the world functions like Extremistan rather than Mediocristan | Assumes the world functions like Mediocristan rather than Extremistan |
| Prefers to be broadly right across a wide range of disciplines and situations | Prefers to be perfectly right in a narrow range of disciplines and situations |
PDF Summary Chapter 4: The Scandal of Prediction
...
Taleb calls our overconfidence in our knowledge “epistemic arrogance.” On the one hand, we overestimate what we know; on the other, we underestimate what we don’t—uncertainty.
It’s important to recognize that Taleb isn’t talking about how much or how little we actually know, but rather the disparity between what we know and what we think we know. We’re arrogant because we think we know more than we actually do.
This arrogance leads us to draw a distinction between “guessing” and “predicting.” Guessing is attempting to fill in a nonrandom variable on the basis of incomplete information; predicting is attempting to fill in a random variable on that same incomplete basis.
Say, for example, someone asks you to estimate how many natural lakes there are in Georgia. There’s a right answer to the question—it’s 0—but you don’t know it, so your answer is a “guess.”
But say that same someone asks you what the U.S. unemployment rate will be in a year. You might look at past figures, GDP growth, and other metrics to try and make a “prediction.” But the fact is, your answer will still be a “guess”—there are just too many factors (unknown unknowns) to venture...
PDF Summary Chapter 5: Why We Can’t Know What We’ll Know
...
Popper’s theory is echoed by a key law in statistics, “the law of iterated expectations,” which describes situations in which to predict something is already to know it: if you expect that at some future date you will expect something, then you already expect it now.
Think, for example, of the invention of the wheel. If a primitive human were to predict the invention of the wheel, that human would already know enough about the wheel to invent it him- or herself.
The same condition applies to contemporary predictions of discoveries. For example, some have predicted that carbon-capture technology will solve global warming. But to make that prediction, one has to know the specifications of that technology—that is, one has to know already how to create it (and thus how to stop global warming now). In other words, if we don’t know exactly how a certain technology will be created, we can’t make the prediction that the technology will be created.
Poincaré’s Nonlinearities
Henri Poincaré, arguably the most highly regarded French mathematician of the late 19th century, contributed to the theory of unpredictability by proposing the existence of nonlinearities—small phenomena that, as time goes on, can have profound...
PDF Summary Chapter 6: Predicting the Past
...
Our Information Is Always Incomplete
Mathematicians and philosophers draw a distinction between “true randomness” and “deterministic chaos.” A “random” system is one whose operation is always and forever impossible to predict. A “chaotic” system is one whose operation could be predicted, but whose complexity makes prediction so difficult that it’s effectively impossible. That is, if we had the right information, we would be able to make sense of “chaos.”
Taleb notes that, for normal people trying to make predictions about their stock portfolio, for example, or the appreciation of the value of their house, there’s no difference between “true randomness” and “deterministic chaos.” That’s because when we’re faced with a dynamical system, we always lack the necessary information to decide whether it’s truly random or simply chaotic.
Such is the case with history as well: Perhaps each major historical event conforms to some incredibly complex plan—that is, perhaps history is just chaotic, not random—but we’ll never have enough information to discern that plan.
The Proper Use of History
Historical narratives are harmless if understood properly: as windows...
PDF Summary Chapter 7: What to Do When You Can’t Predict
...
(Shortform note: There is, of course, the possibility that a Black Swan will affect even the safest investment—in fact, if we take Taleb at his word, there is no such thing as a safe investment.)
4) Distinguish Between Positive Contingencies and Negative Ones
Different areas of society have different exposure to Black Swans, both positive and negative.
For example, scientific research and moviemaking are “positive Black Swan areas”—catastrophes are rare, and there is always the possibility of smashing success. The stock market or catastrophe insurance, meanwhile, are “negative Black Swan areas”—upsides are relatively modest compared to the possibility of financial ruin.
It’s important to note that the history of a negative Black Swan area will underrepresent the possibility of catastrophe. An obvious...
PDF Summary Appendix: The Contours of Extremistan
...
The phenomenon identified by Merton’s study has been called both “cumulative advantage” and “preferential attachment.” These concepts describe the innate human tendency to flock to past successes, regardless of whether those successes are the product of merit or chance.
Although cumulative advantage/preferential attachment provides a better account of unfairness than the superstar effect—because it accounts for randomness in the distribution of advantage/preference—it still isn’t a perfect theory of Extremistan unfairness. This is because, according to cumulative advantage/preferential attachment, winners stay on top. Once an entity—a company, an academic, a professional sports coach—reaches a certain level of success, cumulative advantage/preferential attachment holds that that entity will continue to be successful, because humans naturally favor past success. But this doesn’t reflect reality.
For example, according to cumulative advantage/preferential attachment, Apple will forever be the king of consumer electronics, and Google will forever own the Internet. But even a cursory knowledge of business history shows a belief like this to be misguided. Consider this: If...