by Nassim Nicholas Taleb, Random House, New York, 2007
I am interested in how to live in a world we don’t understand very well – in other words, while most human thought has focused us on how to turn knowledge into decisions, I am interested in how to turn lack of information, lack of understanding, and lack of “knowledge” into decisions.
What better example of a Black Swan than our current economic crisis?
Perhaps: Man keeps looking for a truth that fits his reality.
Given our reality, the truth doesn’t fit.
I must admit, I was about ready to give up midway through, and then I got to page 225. The book was published in 2007. Our current economic crisis began to evolve after the material on and following page 225 was written. It doesn’t provide the cause, but it does define the risk. More important, it argues for better ways to assess risk and plan for it.
P 44 – footnote
The tragedy of capitalism is that since the quality of the returns is not observable from past data, owners of companies, namely shareholders, can be taken for a ride by the managers who show returns and cosmetic profitability but in fact might be taking hidden risks.
In general, positive Black Swans take time to show their effect while negative ones happen very quickly—it is much easier and much faster to destroy than to build.
… awareness of a problem does not mean much—particularly when you have special interests and self-serving institutions in play. [This one applies to both current economics and healthcare reform.]
… this book was … written … by a practitioner whose principal aim is to not be a sucker in things that matter, period. … I am not advocating total risk phobia. … all I will be showing you in this book is how to avoid crossing the street blindfolded. … I have just presented the Black Swan problem in its historical form: the central difficulty of generalizing from available information, or of learning from the past, the known, and the seen.
Unless we concentrate very hard, we are likely to unwittingly simplify the problem because our minds routinely do so without our knowing it. [Said another way: Every significant problem has at least one obvious and simple solution that is wrong.]
… we do not have to be complete skeptics, just semiskeptics.
Once again, I am not dismissing the idea of risk taking … I am only critical of the encouragement of uninformed risk taking… The next few chapters will show in more depth how we tend to dismiss outliers and adverse outcomes when projecting the future… We have been playing Russian roulette; now let’s stop and get a real job.
… We have to accept the fuzziness of the familiar “because” no matter how queasy it makes us feel … I repeat that we are explanation seeking animals who tend to think that everything has an identifiable cause and grab the most apparent one as the explanation. Yet there may not be a visible because; to the contrary, frequently there is nothing, not even a spectrum of possible explanations.
Note there that I am not saying causes do not exist; do not use this argument to avoid trying to learn from history. All I am saying is that it is not so simple; be suspicious of the “because” and handle it with care…
… The problem with experts is they do not know what they do not know. Lack of knowledge and delusion about the quality of your knowledge come together—the same process that makes you know less also makes you satisfied with your knowledge.
[By the way, I am capturing “sound bites” here. The book also includes anecdotes, examples, research results, and analysis. That is harder to capture. For that, you need to read the book.]
…economic forecasters tend to fall closer to one another than to the resulting [actual] outcome. Nobody wants to be off the wall.
We humans are the victims of an asymmetry in the perceptions of random events. We attribute our successes to our skill, and our failures to external events outside our control, namely to randomness. We feel responsible for the good stuff, but not for the bad. This causes us to think that we are better than others at whatever we do for a living. Ninety-four percent of Swedes believe that their driving skills put them in the top 50 percent of Swedish drivers; eighty-four percent of Frenchmen feel that their lovemaking abilities put them in the top half of French lovers.
I know that history is going to be dominated by an improbable event; I just don’t know what that event will be.
Plans fail because of what we have called tunneling, the neglect of sources of uncertainty outside the plan itself.
Trial and error means trying a lot … we have psychological and intellectual difficulties with trial and error, and with accepting that series of small failures are necessary in life. … we humans have a mental hang-up about failures: “You need to love to lose.” … America’s specialty is to take these small risks for the rest of the world, which explains this country’s disproportionate share of innovations.
… if you accept that most “risk measures” are flawed, because of the Black Swan, then your strategy is to be as hyperconservative and hyperaggressive as you can be instead of being mildly aggressive or conservative. … you have high risk on one side and no risk on the other. The average will be medium risk but constitutes a positive exposure to the Black Swan. [Minimize risk on the downside and be very aggressive on the upside; get the Black Swan on the upside.]
All of these recommendations have one point in common: asymmetry. Put yourself in situations where favorable consequences are much larger than unfavorable. … focus on the consequences (which you can know) rather than the probability (which you can’t know).
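[The barbell arithmetic behind these two quotes can be sketched in a few lines. This is a minimal illustration with invented numbers, not Taleb’s own model: assume a portfolio split 90/10 between a near-riskless asset and many small speculative bets. The fractions, the hypothetical 50x payoff, and the function name are all assumptions for illustration.]

```python
# Hypothetical sketch of the "barbell" asymmetry described above:
# cap the downside, leave the upside open. All figures are invented.

SAFE_FRACTION = 0.90    # e.g. short-term Treasuries, assumed to return ~0
RISKY_FRACTION = 0.10   # spread across many small speculative bets

def barbell_outcome(portfolio: float, risky_multiplier: float) -> float:
    """Portfolio value after one period, given how the risky side did.

    risky_multiplier = 0.0  -> every speculative bet is a total loss
    risky_multiplier = 50.0 -> a positive Black Swan paid off 50x
    """
    safe = portfolio * SAFE_FRACTION                    # principal preserved
    risky = portfolio * RISKY_FRACTION * risky_multiplier
    return safe + risky

# Worst case is bounded: losing every bet costs at most 10% of the portfolio.
worst = barbell_outcome(100.0, 0.0)
# Best case is open-ended: one positive Black Swan dominates the outcome.
best = barbell_outcome(100.0, 50.0)
```

[The point of the sketch is the asymmetry: the loss is capped at the 10% allocated to the risky side, while the gain has no ceiling, so the exposure to the Black Swan is on the upside only.]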
We are gliding into disorder, but not necessarily bad disorder. This implies that we will see more periods of calm and stability, with most problems concentrated into a small number of Black Swans. …
Globalization … is here, but it is not all for the good: it creates interlocking fragility, while reducing volatility and giving the appearance of stability. In other words, it creates devastating Black Swans. We have never lived before under the threat of global collapse. Financial institutions have been merging into a smaller number of very large banks. Almost all banks are now interrelated. So the financial ecology is swelling into gigantic, incestuous, bureaucratic banks—when one falls, they all fall. The increased concentration among banks seems to have the effect of making financial crises less likely, but when they happen, they are more global in scale and hit us very hard. We have moved from a diversified ecology of small banks, with varied lending policies, to a more homogeneous framework of firms that all resemble one another. True, we now have fewer failures, but when they occur … [author’s dots]
… banks are in a far worse situation [when something fails] than the Internet. … We would be far better off if there were a different ecology, in which financial institutions went bust on occasion and were rapidly replaced by new ones, thus mirroring the diversity of Internet businesses and the resilience of the Internet economy.
[Written before publication in 2007 and well before the current economic crisis hit. If governments, or any other organizations, are to assess risk and minimize its impact, we need ways of thinking that were essentially nonexistent in the first half of the decade. “Man keeps searching for a truth that fits his reality. Given his reality, the truth doesn’t fit.” Taleb may have defined the reality of risk in a way that lets us create a “truth” about it, and about how to deal with it, that will fit.]
Nassim Taleb on CNBC (8/12/2009)