In a recent Edge seminar held by Philip Tetlock, there was an interesting back-and-forth about the value of predictions. One view, held by Danny Hillis, could be simplified as: people do not think in predictions, they think in stories, and so when people are wrong about the future they are in fact not wrong about predictions but about what story we are in. If this is true, the implication is profound: probabilistic predictions are not a good way to become better at dealing with the future, unless you use them as a tool to craft an alternative story.
This suggests a simple but potentially impactful intervention for anyone seriously considering the future of an issue, a company, or a particular system: uncover the main storyline, then appoint a small team of “red team storytellers” who tell stories that are compatible with the evidence but arrange it differently. The idea emerges in this exchange between a few of the seminar’s participants:
Hillis: There’s another dimension to it, which is that the very act of taking evidence and using that to adjust your prediction demands a framework of explanation. You have an operative story that you think, “Okay, well, this is how Hillary is going to get elected.” If you look at the failure of the intelligence community in the case of WMD, it was a failure of storytelling, not so much a failure of evidence interpretation. Retrospectively, you could tell a story of Saddam deliberately pretending like he had weapons of mass destruction, there’s a story about his generals lying to him; none of those stories were told beforehand.
Had those stories been told beforehand, the evidence could have been interpreted differently. The same evidence could have been interpreted differently. In fact, that’s part of what I was saying about the job of the intelligence community in storytelling is providing the frameworks in which you can interpret evidence and make predictions.
Tetlock: Right, right.
Brand: So you want red team storytellers.
Hillis: Well, yes.
Tetlock: Part of the official operating procedure of the CIA and officials is to have that. In the case of Iraq WMD, that process broke down. You had the situation where the Director of the CIA did say to the President of the United States, “It’s a slam-dunk, Mr. President.”
Stewart Brand’s notion of red team storytellers captures something that has been worrying me about prediction markets and probabilistic reasoning: they assume that propositions or predictions make sense in themselves, when in fact we need to integrate them into a narrative to understand them.
This in no way means predictions are useless. They can be used to challenge the main narrative you are working with: what predictions would have to come true for this narrative to hold?
Important groundwork, then, is laying out the main narrative of the business you are in: the most naive and most common story about why you are where you are. Then start to look for cracks.