From Nassim Taleb in Antifragile:
There is something more severe than the problem with Thomas Friedman, which can be generalized to represent someone causing action while being completely unaccountable for his words.
The phenomenon I will call the Stiglitz syndrome, after an academic economist of the so-called “intelligent” variety called Joseph Stiglitz, is as follows.
Remember the fragility detection in Chapter 19 and my obsession with Fannie Mae. Luckily, I had some skin in the game for my opinions, if only through exposure to a smear campaign. And in 2008, no surprise, Fannie Mae went bust, costing the U.S. taxpayer, I repeat, hundreds of billions (and counting); the entire banking system, carrying similar exposures, exploded along with it.
But around the same period, Joseph Stiglitz, with two colleagues, the Orszag brothers (Peter and Jonathan), looked at the very same Fannie Mae. They assessed, in a report, that “on the basis of historical experience, the risk to the government from a potential default on GSE debt is effectively zero.” Supposedly, they ran simulations—but didn’t see the obvious. They also said that the probability of a default was found to be “so small that it is difficult to detect.” It is statements like these and, to me, only statements like these (intellectual hubris and the illusion of understanding of rare events) that caused the buildup of these exposures to rare events in the economy. This is the Black Swan problem that I was fighting. This is Fukushima.
Now the culmination: in his 2010 I-told-you-so book, Stiglitz claims to have “predicted” the crisis that started in 2007–2008.
Look at this aberrant case of antifragility provided to Stiglitz and his colleagues by society. It turns out that Stiglitz was not just a nonpredictor (by my standards) but was also part of the problem that caused the events, these accumulations of exposures to small probabilities. But he did not notice it! An academic is not designed to remember his opinions because he doesn’t have anything at risk from them.
At the core, people are dangerous when they have that strange skill that allows their papers to be published in journals but decreases their understanding of risk. So the very same economist who caused the problem then postdicted the crisis, and then became a theorist on what happened. No wonder we will have larger crises.
The central point: had Stiglitz been a businessman with his own money on the line, he would have blown up, terminated. Or had he been in nature, his genes would have been made extinct—so people with such misunderstanding of probability would eventually disappear from our DNA. What I found nauseating was the government hiring one of his coauthors.
I am reluctantly calling the syndrome by Stiglitz’s name because I find him the smartest of economists, one with the most developed intellect for things on paper—except that he has no clue about the fragility of systems. And Stiglitz symbolizes harmful misunderstanding of small probabilities by the economics establishment. It is a severe disease, one that explains why economists will blow us up again.
The Stiglitz syndrome corresponds to a form of cherry-picking, the nastiest variety because the perpetrator is not aware of what he is doing. It is a situation in which someone doesn’t just fail to detect a hazard but contributes to its cause while ending up convincing himself—and sometimes others—of the opposite, namely, that he predicted it and warned against it. It corresponds to a combination of remarkable analytical skills, blindness to fragility, selective memory, and absence of skin in the game.
Stiglitz Syndrome = fragilista (with good intentions) + ex post cherry-picking
There are other lessons here, related to the absence of penalty. This is an illustration of the academics-who-write-papers-and-talk syndrome in its greatest severity (unless, as we will see, they have their soul in it). So many academics propose something in one paper, then the opposite in another paper, without penalty to themselves for having been wrong in the first, since consistency is required only within a single paper, not across one’s career. This would be fine, as someone may evolve and contradict earlier beliefs, but then the earlier “result” should be withdrawn from circulation and superseded by the new one, just as a book’s new edition supersedes the preceding one. This absence of penalty makes them antifragile at the expense of the society that accepts the “rigor” of their results.

Further, I am not doubting Stiglitz’s sincerity, or some weak form of sincerity: I believe he genuinely thinks he predicted the financial crisis. So let me rephrase the problem: people who incur no harm from their statements can cherry-pick from the many, often contradictory, things they have said in the past, and end up convincing themselves of their intellectual lucidity on the way to the World Economic Forum at Davos.
There is the iatrogenics of the medical charlatan and snake oil salesperson: he causes harm, but he sort of knows it and lies low after he is caught. And there is a far more vicious form of iatrogenics practiced by experts who use their more acceptable status to claim later that they warned of harm. Since these experts did not know they were causing iatrogenics, they cure iatrogenics with iatrogenics. Then things explode.
Finally, the cure for many ethical problems maps to the exact cure for the Stiglitz syndrome, which I state now.
Never ask anyone for their opinion, forecast, or recommendation. Just ask them what they have—or don’t have—in their portfolio.