The Black Swan: The Impact of the Highly Improbable
Wonderful book. It makes me think about probability from a different perspective.
A small number of Black Swans explain almost everything in our world, from the success of ideas and religions, to the dynamics of historical events, to elements of our own personal lives.
Contrary to social-science wisdom, almost no discovery, no technologies of note, came from design and planning—they were just Black Swans.
The strategy for the discoverers and entrepreneurs is to rely less on top-down planning and focus on maximum tinkering and recognizing opportunities when they present themselves.
Everybody knows that you need more prevention than treatment, but few reward acts of prevention.
The Black Swan comes from our misunderstanding of the likelihood of surprises, those unread books, because we take what we know a little too seriously.
The human mind suffers from three ailments as it comes into contact with history, the triplet of opacity. They are: the illusion of understanding, or how everyone thinks he knows what is going on in a world that is more complicated (or random) than they realize; the retrospective distortion, or how we can assess matters only after the fact, as if they were in a rearview mirror (history seems clearer and more organized in history books than in empirical reality); and the overvaluation of factual information and the handicap of authoritative and learned people, particularly when they create categories—when they “Platonify.”
Our minds are wonderful explanation machines, capable of making sense out of almost anything, capable of mounting explanations for all manner of phenomena, and generally incapable of accepting the idea of unpredictability.
These events were unexplainable, but intelligent people thought they were capable of providing convincing explanations for them—after the fact. Furthermore, the more intelligent the person, the better sounding the explanation. What’s more worrisome is that all these beliefs and accounts appeared to be logically coherent and devoid of inconsistencies.
Of the millions, maybe even trillions, of small facts that prevail before an event occurs, only a few will turn out to be relevant later to your understanding of what happened.
Nobody knew anything, but elite thinkers thought that they knew more than the rest because they were elite thinkers.
Scalable professions are good only if you are successful; they are more competitive, produce monstrous inequalities, and are far more random, with huge disparities between efforts and rewards—a few can take a large share of the pie, leaving others out entirely through no fault of their own.
The inequity comes when someone perceived as being marginally better gets the whole pie.
It is hard for us to accept that people do not fall in love with works of art only for their own sake, but also in order to feel that they belong to a community.
When your sample is large, no single instance will significantly change the aggregate or the total.
With Mediocristan-style randomness it is not possible to have a Black Swan surprise such that a single event can dominate a phenomenon.
Matters that seem to belong to Mediocristan (subjected to what we call type 1 randomness): height, weight, calorie consumption, income for a baker, a small restaurant owner, a prostitute, or an orthodontist; gambling profits (in the very special case, assuming the person goes to a casino and maintains a constant betting size), car accidents, mortality rates, “IQ” (as measured).
Matters that seem to belong to Extremistan (subjected to what we call type 2 randomness): wealth, income, book sales per author, book citations per author, name recognition as a “celebrity,” number of references on Google, populations of cities, uses of words in a vocabulary, numbers of speakers per language, damage caused by earthquakes, deaths in war, deaths from terrorist incidents, sizes of planets, sizes of companies, stock ownership, height between species (consider elephants and mice), financial markets (but your investment manager does not know it), commodity prices, inflation rates, economic data.
Mediocristan is where we must endure the tyranny of the collective, the routine, the obvious, and the predicted; Extremistan is where we are subjected to the tyranny of the singular, the accidental, the unseen, and the unpredicted.
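The Mediocristan/Extremistan contrast in the notes above is easy to see numerically. Below is a minimal simulation sketch, not from the book: the sample size, the Gaussian model for height, and the Pareto(1.1) model for wealth are all illustrative assumptions. The point it shows is the one the highlights make: in a thin-tailed sample no single observation moves the aggregate, while in a fat-tailed sample one draw can dominate it.

```python
import random

# Illustrative sketch (assumptions, not the book's data): compare how much
# the single largest observation contributes to the total in a thin-tailed
# world (Mediocristan) versus a fat-tailed one (Extremistan).

random.seed(42)
N = 100_000

# Mediocristan example: heights, modeled as roughly Gaussian around 170 cm.
heights = [random.gauss(170, 10) for _ in range(N)]

# Extremistan example: wealth, modeled with a heavy-tailed Pareto draw.
# random.paretovariate(alpha) returns values >= 1; alpha near 1 gives very fat tails.
wealth = [random.paretovariate(1.1) for _ in range(N)]

def max_share(sample):
    """Fraction of the aggregate accounted for by the single largest value."""
    return max(sample) / sum(sample)

print(f"Mediocristan (height): largest observation is "
      f"{max_share(heights):.6%} of the total")
print(f"Extremistan (wealth):  largest observation is "
      f"{max_share(wealth):.2%} of the total")
```

On a typical run the tallest person accounts for a vanishing fraction of one percent of total height, while the single largest wealth draw can account for several percent of total wealth, and occasionally far more: exactly the "single event dominates the phenomenon" property that separates type 2 from type 1 randomness.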
Mistaking a naïve observation of the past as something definitive or representative of the future is the one and only cause of our inability to understand the Black Swan.
Positive Black Swans take time to show their effect while negative ones happen very quickly—it is much easier and much faster to destroy than to build.
Statisticians, it has been shown, tend to leave their brains in the classroom and engage in the most trivial inferential errors once they are let out on the streets.
A thousand days cannot prove you right, but one day can prove you to be wrong.
The narrative fallacy addresses our limited ability to look at sequences of facts without weaving an explanation into them, or, equivalently, forcing a logical link, an arrow of relationship, upon them.
It takes considerable effort to see facts (and remember them) while withholding judgment and resisting explanations.
We, members of the human variety of primates, have a hunger for rules because we need to reduce the dimension of matters so they can get into our heads.
Conventional wisdom holds that mere absence of nonsense may not be sufficient to make something true.
Patients who spend fifteen minutes every day writing an account of their daily troubles do indeed feel better about what has befallen them. You feel less guilty for not having avoided certain events; you feel less responsible for them. Things appear as if they were bound to happen.
After a candidate’s defeat in an election, you will be supplied with the “cause” of the voters’ disgruntlement. Any conceivable cause can do.
We feel the sting of man-made damage far more than that caused by nature.
Our misunderstanding of the Black Swan can be largely attributed to our using System 1, the fast, experiential mode of thinking.
The way to avoid the ills of the narrative fallacy is to favor experimentation over storytelling, experience over history, and clinical knowledge over theories.
Many people labor in life under the impression that they are doing something right, yet they may not show solid results for a long time.
Modern reality rarely gives us the privilege of a satisfying, linear, positive progression: you may think about a problem for a year and learn nothing; then, unless you are disheartened by the emptiness of the results and give up, something will come to you in a flash.
Plenty of mildly good news is preferable to one single lump of great news.
It is unfortunate that the right strategy for our current environment may not offer internal rewards and positive feedback.
One of the attributes of a Black Swan is an asymmetry in consequences—either positive or negative.
The hippocampus takes the insult of chronic stress seriously, incurring irreversible atrophy. Contrary to popular belief, these small, seemingly harmless stressors do not strengthen you; they can amputate part of your self.
Humans will believe anything you say provided you do not exhibit the smallest shadow of diffidence; like animals, they can detect the smallest crack in your confidence before you express it.
The graveyard of failed persons will be full of people who shared the following traits: courage, risk taking, optimism, et cetera. Just like the population of millionaires. There may be some differences in skills, but what truly separates the two is for the most part a single factor: luck.
We see the obvious and visible consequences, not the invisible and less obvious ones. Yet those unseen consequences can be—nay, generally are—more meaningful.
Around twenty-five hundred people were directly killed by bin Laden’s group in the Twin Towers of the World Trade Center. Their families benefited from the support of all manner of agencies and charities, as they should. But, according to researchers, during the remaining three months of the year, close to one thousand people died as silent victims of the terrorists. How? Those who were afraid of flying and switched to driving ran an increased risk of death. There was evidence of an increase of casualties on the road during that period; the road is considerably more lethal than the skies. These families got no support—they did not even know that their loved ones were also the victims of bin Laden.
Our neglect of silent evidence kills people daily. Assume that a drug saves many people from a potentially dangerous ailment, but runs the risk of killing a few, with a net benefit to society. Would a doctor prescribe it? He has no incentive to do so. The lawyers of the person hurt by the side effects will go after the doctor like attack dogs, while the lives saved by the drug might not be accounted for anywhere.
A life saved is a statistic; a person hurt is an anecdote. Statistics are invisible; anecdotes are salient. Likewise, the risk of a Black Swan is invisible.
The illusion of stability. The bias lowers our perception of the risks we incurred in the past, particularly for those of us who were lucky to have survived them. Your life came under a serious threat but, having survived it, you retrospectively underestimate how risky the situation actually was.
Evolution is a series of flukes, some good, many bad. You only see the good. But, in the short term, it is not obvious which traits are really good for you, particularly if you are in the Black Swan–generating environment of Extremistan.
Whenever your survival is in play, don’t immediately look for causes and effects. The main identifiable reason for our survival of such diseases might simply be inaccessible to us: we are here since, Casanova-style, the “rosy” scenario played out, and if it seems too hard to understand it is because we are too brainwashed by notions of causality and we think that it is smarter to say because than to accept randomness.
The military collected more genuine intellects and risk thinkers than most if not all other professions.