Upfront analysis beats postmortem explanations

- David Leonhardt

The Israeli intelligence service asked the great psychologist Daniel Kahneman for help in the 1970s, and Kahneman came back with a suggestion: Get rid of the classic intelligence report. It allows leaders to justify any conclusion they want, Kahneman said. In its place, he suggested giving the leaders estimated probabilities of events.

The intelligence service did so, and an early report concluded that one scenario would increase the chance of full-scale war with Syria by 10 percent. Seeing the number, a top official was relieved. “Ten percent increase?” he said. “That is a small difference.”

Kahneman was horrified (as Michael Lewis recounts in his book “The Undoing Project”). A 10 percent increase in the chance of catastrophic war was serious. Yet the official decided that 10 wasn’t so different from zero.

Looking back years later, Kahneman said: “No one ever made a decision because of a number. They need a story.”

His change of heart is a good way to introduce my ritual self-criticism. There is a burgeoning tradition in which columnists devote a year-end column to the errors of our ways. The journalist Dave Weigel calls it “pundit accountability.”

I’ll start with some back story: Like the pre-1970s Israeli army, the news business of old didn’t have much use for probabilities, outside of the weather report. These days, though, probabilities pop up all over.

At 10 p.m. on Alabama’s recent election night, The New York Times said that Doug Jones had roughly a 70 percent chance of winning, based on counted votes. (That scoreboard drew 13 million views.) Likewise, the financial media reports recession odds, and sports websites publish real-time win probabilities.

I’m a probability advocate. In previous jobs, I have helped create election scoreboards. Probabilities are more meaningful than safe “anything can happen” platitudes, vague “it’s likely” analyses or artificially confident guarantees.

But I’ve come to realize that I was wrong about a major aspect of probabilit­ies.

They are inherently hard to grasp. That’s especially true for an individual event, like a war or election. People understand that if they roll a die 100 times, they will get some 1’s. But when they see a probability for one event, they tend to think: Is this going to happen or not?

They then effectively round to 0 or to 100 percent. That’s what the Israeli official did. It’s also what many Americans did when they heard Hillary Clinton had a 72 percent or 85 percent chance of winning. It’s what football fans did in the Super Bowl when the Atlanta Falcons had a 99 percent chance of victory.

And when the unlikely happens, people scream: The probabilities were wrong!

Usually, they were not wrong. The screamers were wrong.

I used to believe that the best response was explanation and context. After all, people understand that many outcomes with long odds do happen. “Just because it’s rare,” says the medical expert H. Gilbert Welch, “doesn’t mean it doesn’t happen.” You draw an ace (8 percent). A random baby girl grows up to be at least 5’9” (6 percent). New York has a white Christmas (11 percent). On my computer, I’ve got a long list of these unlikely events.

But I now think explanation is doomed to fail. For an individual event, people can’t resist saying that a probability was “right” if it was above 50 percent and “wrong” if it was below 50 percent. When this happens, those of us who believe in probabilities can’t just shake our heads and mutter about white Christmases. We have to communicate more effectively.

I think part of the answer lies with Kahneman’s insight: Human beings need a story.

It’s not enough to say an event has a 10 percent probability. People need a story that forces them to visualize the unlikely event, so they don’t round 10 to zero.

Imagine that a forecast giving Candidate X a 10 percent chance included a prominent link, “How X wins.” It would explain how the polling could be off and include a winning map for X. It would all but shout: This really may happen.

Welch, a Dartmouth professor, pointed me to an online pictograph about breast-cancer risk. It shows 1,000 stick figures, of which 973 are gray (no cancer), 22 are yellow (future survivor) and 5 are red (die in next 10 years). You can see the most likely outcome without ignoring the others.

Yes, I understand that ideas like this won’t eliminate confusion. But even modest progress would be worthwhile.

The rise of big data means that probabilities are becoming a larger part of life. And our misunderstandings have real costs. Obama administration officials, to take one example, might have treated Russian interference more seriously if they hadn’t rounded Donald Trump’s victory odds down to almost zero. Alas, unlike a dice roll, the election is not an event we get to try again.

David Leonhardt is a columnist for The New York Times.
