Nassim Taleb and IPE Prof. Mark Blyth apply "Black Swans" to foreign policy and the Arab Spring:
Complex systems that have artificially suppressed volatility tend to become extremely fragile, while at the same time exhibiting no visible risks. In fact, they tend to be too calm and exhibit minimal variability as silent risks accumulate beneath the surface. Although the stated intention of political leaders and economic policymakers is to stabilize the system by inhibiting fluctuations, the result tends to be the opposite. These artificially constrained systems become prone to "Black Swans"--that is, they become extremely vulnerable to large-scale events that lie far from the statistical norm and were largely unpredictable to a given set of observers.
Such environments eventually experience massive blowups, catching everyone off-guard and undoing years of stability or, in some cases, ending up far worse than they were in their initial volatile state. Indeed, the longer it takes for the blowup to occur, the worse the resulting harm in both economic and political systems. ...
Take, for example, the recent celebrated documentary on the financial crisis, Inside Job, which blames the crisis on the malfeasance and dishonesty of bankers and the incompetence of regulators. Although it is morally satisfying, the film naively overlooks the fact that humans have always been dishonest and regulators have always been behind the curve. The only difference this time around was the unprecedented magnitude of the hidden risks and a misunderstanding of the statistical properties of the system. ...
Humans fear randomness--a healthy ancestral trait inherited from a different environment. Whereas in the past, which was a more linear world, this trait enhanced fitness and increased chances of survival, it can have the reverse effect in today's complex world, making volatility take the shape of nasty Black Swans hiding behind deceptive periods of "great moderation." This is not to say that any and all volatility should be embraced. Insurance should not be banned, for example.
But alongside the "catalysts as causes" confusion sit two mental biases: the illusion of control and the action bias (the illusion that doing something is always better than doing nothing). This leads to the desire to impose man-made solutions. Greenspan's actions were harmful, but it would have been hard to justify inaction in a democracy where the incentive is to always promise a better outcome than the other guy, regardless of the actual, delayed cost.
I don't have too much to say about this right now, but I find the basic argument very interesting. (For those without institutional access to Foreign Affairs, Blyth describes the basic principles in this audio interview.) The argument isn't especially new, particularly coming from Taleb, but generalizing it -- treating economies, polities, and other social systems as complex adaptive systems -- is important. The central claim -- the more we try to keep a lid on volatility, the bigger the inevitable explosion -- may or may not be strictly true. It's the sort of claim that needs more empirical support than they provide in this essay alone. But the broader point about complex systems is surely true, and internalizing it matters for social scientists and policymakers alike. I expect this sort of thinking to guide a lot of future research in the social sciences.