Nonrival readers did not expect Elon Musk to lay off half of Twitter. A third maybe. But half? It’d wreck the company. Most Nonrival readers put the chances at 15% or lower. But he did it.
What can we learn when a forecast doesn't go the way we thought? And how do we even know if we were wrong in the first place? If you say there's a 25% chance of two consecutive coin flips coming up heads, the fact that it then happens doesn't mean you were wrong.
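A quick way to see this: simulate the two-flip bet many times. A 25% forecast for "two heads in a row" is exactly right, so the event occurring on any single try tells you nothing about the forecast's quality. Here's a short sketch (my illustration, not from the interview):

```python
import random

random.seed(42)

# Each trial: flip two fair coins; the event is "both heads".
# Over many trials the event occurs about a quarter of the time,
# so a single occurrence can't show the 25% forecast was wrong.
trials = 100_000
hits = sum(
    random.random() < 0.5 and random.random() < 0.5
    for _ in range(trials)
)
print(hits / trials)  # about 0.25
```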
Don Moore is a psychologist at Berkeley who studies overconfidence, and the author of the excellent book Perfectly Confident. This week I asked Don for his thoughts on learning from cases that don't turn out the way you predicted.
Our interview is below:
Nonrival: What should a good decision maker or forecaster do when something turns out differently than they expected?
Don Moore: After you learn an outcome, whether you were right or wrong, it's worth going back and asking what (if anything) you have to learn. It's not as simple as patting yourself on the back if you're right and resolving to do better if you were wrong. The goal is to learn what you can generalize and apply to the next forecast. There are a couple of dangerous biases that make this difficult.
The first danger is what poker players like Annie Duke call "resulting": you judge a decision based exclusively on its outcome. In truth, you should judge the decision based on your effective use of the information you had at the time you made it. That leads to the second danger: hindsight bias predisposes you to selectively recall all the reasons the observed outcome was likely, making it harder to remember the contrary information.
In most complex phenomena, there is an element of irreducible uncertainty. It's easiest to see in chance devices like a coin flip. It would be a mistake to beat yourself up for predicting heads when the coin comes up tails. Should you have known that it would come up tails? No. There was only a 50% chance of tails. And no matter how many coin flips you observe, you can't predict the next one with certainty greater than 50%. The irreducible uncertainty can't be eliminated. Expect it and factor it in. Your goal should be understanding as much of the rest as you can.
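To make the coin point concrete (my illustration, not Don's): with a fair coin, no strategy built on past flips beats 50% accuracy — the history carries no information about the next flip.

```python
import random

random.seed(0)

flips = [random.random() < 0.5 for _ in range(100_000)]

# Two strategies for predicting the next flip:
# (1) always predict heads, (2) predict whatever the previous flip was.
# With a fair coin, both land at about 50% accuracy.
always_heads = sum(flips[1:]) / (len(flips) - 1)
follow_last = sum(
    prev == cur for prev, cur in zip(flips, flips[1:])
) / (len(flips) - 1)
print(round(always_heads, 3), round(follow_last, 3))  # both near 0.5
```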
Given that risk of "resulting," how do you assess the possibility that you did misjudge something?
Here I think of Maria Konnikova's book The Biggest Bluff. She was recounting a poker hand to her coach, and he stopped her before she disclosed who won the hand. He wanted her to explain what she knew when she bet, and evaluate her decision based on what she knew at that time. The analogy in forecasting would be to attempt to explain your forecast using what you knew at the time. Can you justify that to someone else (ideally someone who doesn't know the actual outcome)?
Your work and your book Perfectly Confident point to how most of us suffer from overconfidence. Is the lesson from bad forecasts or decisions just to be more uncertain?
We could all benefit from a dose of humility. We're all vulnerable to being too sure that we know what's going to happen. The confidence we assign to our forecasts usually exceeds their accuracy. How come? Because the world surprises us with unknown unknowns. That is, we will sometimes be wrong for reasons we fail to anticipate. To pick one dramatic example, forecasts for US unemployment rates in the second quarter of 2020 look recklessly overconfident because forecasters did not anticipate the Covid-19 pandemic. Our forecasts will always be vulnerable to big shocks like that, and so good calibration demands that we adjust our confidence downward, especially when we realize we don't know everything and we are vulnerable to being surprised.

Nassim Taleb has argued that "black swan" events like that render forecasting worthless. The situation isn't quite that grim. Forecasts are useful. In fact, they're essential. Every decision depends on a forecast of its likely consequences. But those forecasts and those decisions will be better if they are made with appropriate humility.
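One standard way to put a number on "confidence exceeds accuracy" is the Brier score — mean squared error between forecast probabilities and outcomes. The interview doesn't name it, but it illustrates the point: on events that truly occur 70% of the time, a calibrated 70% forecast scores better than an overconfident 95% one, even though both "call" the likely outcome. A sketch:

```python
import random

random.seed(1)

# Events that occur with true probability 0.7.
true_p = 0.7
events = [random.random() < true_p for _ in range(100_000)]

def brier(p, outcomes):
    # Mean squared error between forecast probability p and 0/1 outcomes.
    # Lower is better; overconfidence is penalized heavily when wrong.
    return sum((p - o) ** 2 for o in outcomes) / len(outcomes)

print(brier(0.70, events))  # calibrated forecast
print(brier(0.95, events))  # overconfident forecast scores worse
```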