Humans are great at prediction

Humans are terrible at making forecasts, we’re often told. Here’s one recent example at Bloomberg View:

I don’t mean to pick on either of those folks; you can randomly name any 10 strategists, forecasters, pundits and commentators and the vast majority of their predictions will be wrong. Not just a little wrong, but wildly, hilariously off.

The author is talking specifically about the economy, and I mostly agree with what I think he’s trying to say. But I’m tired of this framing:

Every now and again, it is worth reminding ourselves just how terrible humans are at predicting what will happen in markets and/or the economy.

Humans are amazing at predicting the future, and yes, that includes what will happen in the economy. It's just that when we sit down to talk about forecasting, for some reason we decide to throw out all the good predictions and focus on the stuff that's just hard enough to be beyond our reach.

There are two main avenues through which this happens. The first is that we idolize precision, and ignore the fact that applying a probability distribution to a range of possibilities is a type of prediction. So the piece above is right that it's incredibly difficult for an economist to predict exactly the number of jobs that will be added in a given month. But experts can assign probabilities to different outcomes. They can say with high confidence, for example, that the unemployment rate for August will be somewhere between, say, 5.5% and 6.5%.

You might think that’s not very impressive. But it’s a prediction, and a useful one. The knowledge that the unemployment rate is unlikely to spike over any given month allows businesses to feel confident in making investments, and workers to feel confident making purchases. I’m not saying we’re perfect at this probabilistic approach — recessions still surprise us. But it’s a legitimate form of prediction at which we do far better than random.
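This kind of range forecast is easy to make concrete. Here's a minimal sketch, using entirely hypothetical numbers (the monthly changes and the 6.0% current rate are made up for illustration, not real BLS data), of how a forecaster might turn the history of month-over-month changes in the unemployment rate into a probability that next month's rate stays inside a band:

```python
from statistics import NormalDist

# Hypothetical month-over-month changes in the unemployment rate,
# in percentage points. A real forecaster would use published data.
monthly_changes = [0.1, -0.1, 0.0, 0.2, -0.2, 0.0, 0.1, -0.1, 0.0, -0.1, 0.1, 0.0]

n = len(monthly_changes)
mean = sum(monthly_changes) / n
# Sample variance of the monthly changes.
var = sum((c - mean) ** 2 for c in monthly_changes) / (n - 1)
dist = NormalDist(mean, var ** 0.5)  # assume changes are roughly normal

current_rate = 6.0  # hypothetical current unemployment rate, in percent

# Probability that next month's rate lands between 5.5% and 6.5%,
# i.e. that the change falls between -0.5 and +0.5 points.
p_in_band = dist.cdf(6.5 - current_rate) - dist.cdf(5.5 - current_rate)
print(f"P(5.5% <= rate <= 6.5%) = {p_in_band:.3f}")
```

With changes this small relative to the band, the probability comes out close to 1, which is exactly the point: "the rate won't spike this month" is a genuine, high-confidence prediction, even though it names a range rather than a single number.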

That example leads me to the second way in which we ignore good predictions. Talk of how terrible we are at forecasting ignores the "easy" cases. Will the sun rise tomorrow? Will Google still be profitable in a week? Will the price of milk triple over the next 10 days? We can answer these questions fairly easily, with high confidence. Yes, they seem easy. But they seem easy precisely because human knowledge and the scientific process have been so successfully incorporated into modern life.

And there are plenty of other predictions between these easy cases and the toughest ones that get thrown around. If you invest in the stock market for the long term, you're likely to make money. Somewhere around a third of venture-backed startups won't make it to their 10th birthday. A few years down the line, today's college graduates will have higher wages on average than their peers without a degree. None of these things are certain. But we can assign probabilities to them that would exceed that of a dart-throwing chimp. Perhaps you're not impressed, but to me this is the foundation of modern society.
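"Better than a dart-throwing chimp" can itself be measured. A standard tool is the Brier score, the mean squared error between forecast probabilities and what actually happened. The sketch below uses made-up outcomes for ten hypothetical startups to show how a forecaster who simply knows the base rate already beats the chimp's 50/50 guesses:

```python
# Brier score: mean squared error between forecast probabilities and outcomes.
# 0 is perfect; guessing 0.5 on everything scores exactly 0.25; lower is better.
def brier(forecasts, outcomes):
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(outcomes)

# Hypothetical data: did each of 10 startups survive to its 10th birthday?
outcomes = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]  # 7 of 10 survived (illustrative only)

base_rate_forecast = [0.67] * len(outcomes)  # uses the ~two-thirds survival rate
chimp_forecast = [0.5] * len(outcomes)       # dart-throwing chimp: 50/50 on everything

print("base rate:", brier(base_rate_forecast, outcomes))
print("chimp:    ", brier(chimp_forecast, outcomes))
```

The base-rate forecaster scores lower (better) than the chimp, without any insight into individual companies at all. That gap, between knowing nothing and knowing the base rate, is the kind of unglamorous prediction skill the "humans are terrible forecasters" framing writes off.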

None of this is to say we shouldn’t hold pundits and experts more accountable for their bad predictions, or that we shouldn’t work to improve our predictions where possible. (And research suggests such improvement is often possible.)

But let’s not lose sight of all the ways in which we excel at prediction. Forecasting is a really broad category of thinking that is at the center of modern science. And compared to our ancestors, we’re pretty good at it.



After the rapture

How will people like this respond when Judgement Day / the rapture doesn't arrive tomorrow?

Here’s a clue:

“A MAN WITH A CONVICTION is a hard man to change. Tell him you disagree and he turns away. Show him facts or figures and he questions your sources. Appeal to logic and he fails to see your point.” So wrote the celebrated Stanford University psychologist Leon Festinger (PDF), in a passage that might have been referring to climate change denial—the persistent rejection, on the part of so many Americans today, of what we know about global warming and its human causes. But it was too early for that—this was the 1950s—and Festinger was actually describing a famous case study in psychology.

Festinger and several of his colleagues had infiltrated the Seekers, a small Chicago-area cult whose members thought they were communicating with aliens—including one, “Sananda,” who they believed was the astral incarnation of Jesus Christ. The group was led by Dorothy Martin, a Dianetics devotee who transcribed the interstellar messages through automatic writing.

Through her, the aliens had given the precise date of an Earth-rending cataclysm: December 21, 1954. Some of Martin’s followers quit their jobs and sold their property, expecting to be rescued by a flying saucer when the continent split asunder and a new sea swallowed much of the United States. The disciples even went so far as to remove brassieres and rip zippers out of their trousers—the metal, they believed, would pose a danger on the spacecraft.

Festinger and his team were with the cult when the prophecy failed. First, the “boys upstairs” (as the aliens were sometimes called) did not show up and rescue the Seekers. Then December 21 arrived without incident. It was the moment Festinger had been waiting for: How would people so emotionally invested in a belief system react, now that it had been soundly refuted?

At first, the group struggled for an explanation. But then rationalization set in. A new message arrived, announcing that they’d all been spared at the last minute. Festinger summarized the extraterrestrials’ new pronouncement: “The little group, sitting all night long, had spread so much light that God had saved the world from destruction.” Their willingness to believe in the prophecy had saved Earth from the prophecy!

From that day forward, the Seekers, previously shy of the press and indifferent toward evangelizing, began to proselytize. “Their sense of urgency was enormous,” wrote Festinger. The devastation of all they had believed had made them even more certain of their beliefs.

That’s from Chris Mooney’s excellent article on the science of denial.