(continuing our series in which we consider the extent to which modern finance has been periodically reshaped by misreadings and misunderstandings of Nassim Taleb’s books)
After looking at “Fooled By Randomness” and the barbell portfolio, we now come to what looked like it was going to be the easiest part of this series when I started planning it – “The Black Swan”. It is quite comically easy to see what people misunderstood here, because we are given a reminder every December, when sell-side research firms publish their annual lists of “Twelve Black Swans For 2023”.
Or is it?
Well, to an extent, yes it is. I can’t find the clip (perhaps it’s never been posted on YouTube), but I remember watching an interview on CNBC, in a hotel room somewhere, where the host, having floundered through a discussion of the themes of the book, decided to wrap things up by asking Taleb how he converted them into “actionable equity ideas” and whether he could give the viewers a few tips. I’ve never seen quite such a look of weary disgust on anyone’s face, and I must confess I giggled.
Ever since then, I’ve wished that every single interview with anyone on television – politicians, chefs, Nobel prizewinners, darts champions – would conclude with “and finally, best European or North American ideas, long or short”. Trying to take a book like TBS and turn it into a list of stock calls is, intrinsically, a ridiculous thing to do.
But on the other hand, the more I thought about it while writing this piece, the less sure I got that “a list of Black Swans” is necessarily a contradiction in terms. Let’s look at two of the key examples of a Black Swan risk, from chapter 9 (“The Ludic Fallacy”). This is in the section where Taleb is describing a visit to a casino, and making the point that the biggest risks to it are nothing to do with the probability of loss associated with the everyday risk taking that forms the business of a casino.
“First, they lost around $100 million when an irreplaceable performer in their main show was maimed by a tiger … The tiger had been reared by the performer and even slept in his bedroom; until then, nobody suspected that the powerful animal would turn against its master. In scenario analyses, the casino had even conceived of the animal jumping into the crowd, but nobody came near to the idea of insuring against what happened.
[…]
“Third, casinos must file a special form with the IRS documenting a gambler’s profit if it exceeds a given amount. The employee who was supposed to mail the forms hid them, instead, for completely inexplicable reasons, in boxes under his desk … the casino faced the near loss of a gambling licence … they ended up paying a monstrous fine.”
As Chris Rock said about the first event, of course, that tiger didn’t go crazy – that tiger went tiger. If you’ve considered, and got insurance quotes for, the possibility that it might become aggressive and jump into the audience, you’ve clearly got the mental equipment to conceive of what actually happened. And although Taleb does muddy the waters a bit in the other example (he says “The employee’s refraining from sending the documents was truly impossible to predict”), it wasn’t really, was it? Things like that happen all the time, and lots of companies have processes to check that paperwork like this has been filed on time. It just happened that this casino didn’t.
But that’s the key issue here. Thinking of Black Swans as intrinsically impossible to predict is itself an epistemic coping mechanism. More often, they are things that you could have predicted, but you didn’t.
This is, of course, much more frightening and disconcerting, just like the scariest horror movies are the ones with plausible human beings as villains rather than supernatural monsters. Taleb’s “Narrative Fallacy” is a particularly important part of his thinking, and it ensures that in most cases, a Black Swan risk is something that you can and will be blamed for, because once it’s happened it’s pretty obvious. Not only that, but the people blaming you for it will say things like “a tiger biting someone’s head? Not exactly a Black Swan, is it?” (I’ve certainly been guilty of this myself in the past!)
Black Swans happen because they are things which exist, but which aren’t part of someone’s mental representation of the world. (They exist among the facts which don’t have a counterpart “stylised fact”). And so those sell-side lists might be more useful than you’d think – they function as a sort of “have you thought about this?” for things that are in someone else’s mental model but might not be in yours.
Of course, the moment you pay attention to something like this, it isn’t a Black Swan any more. And even more unfortunately, given limits on bandwidth and mental capacity, the choice to pay attention to one thing is inevitably a choice not to pay attention to something else. One of the main reasons why unpredictable, low-probability events have a disproportionate impact on people and systems is that they are the only kind of thing which can have such an impact; if something is not a Black Swan, then it’s part of the ordinary course of business.