3 good lines on when to trust experts

When should we defer to experts, and on what grounds? My general view is: more often than most people do. But it’s a complicated question; as Phil Tetlock has shown, for example, not every credentialed expert is worthy of our deference. And even if you want to defer, summarizing what the experts actually think can be a challenge.

But I’ve come across three separate tidbits, each related in some way to this question, and I thought I’d preserve them here.

Dan Kahneman on deference to scientific consensus

One reason you might not trust certain experts is the ongoing “crisis” in social science. That’s a subject for another post, but here’s the famed psychologist Dan Kahneman responding to a post challenging his belief in the literature on priming:

My position when I wrote “Thinking, Fast and Slow” was that if a large body of evidence published in reputable journals supports an initially implausible conclusion, then scientific norms require us to believe that conclusion. Implausibility is not sufficient to justify disbelief, and belief in well-supported scientific conclusions is not optional. This position still seems reasonable to me – it is why I think people should believe in climate change. But the argument only holds when all relevant results are published.

Kahneman writes that belief in scientific conclusions is “not optional” because, I believe, as a student of bias he realizes that the alternative is far worse. At least for most people, not believing what’s been published in peer-reviewed journals doesn’t mean a sort of enlightened skepticism; it means falling back on the hunches you’d always wanted to believe in the first place.

Fact-checking and triangulating the truth

This one comes from Lucas Graves’ excellent new book Deciding What’s True, about modern fact-checking. I’m writing more about the book, but for now I want to highlight one phrase and a few sentences that surround it:

The need to consult experts presents a real problem for fact-checkers, pulling them inevitably into the busy national market of data and analysis retailed in the service of ideological agendas. “You’re going to seek independent, nonbiased sources — of which in today’s world there are none,” a PolitiFact editor joked during training…

PolitiFact items often feature analysis from experts or groups with opposing ideologies, a strategy described internally as “triangulating the truth.” “Seek multiple sources,” an editor told new fact-checkers during a training session. “If you can’t get an independent source on something, go to a conservative and go to a liberal and see where they overlap.” Such “triangulation” is not a matter of artificial balance, the editor argued: the point is to make a decisive ruling by forcing these experts to “focus on the facts.” As noted earlier, fact-checkers cannot claim expertise in the complex areas of public policy their work touches on. But they are confident in their ability to choose the right experts and to distill useful information from political arguments.

Bertrand Russell on expert consensus

This one is a wonderful quote from Bertrand Russell, and it comes via an article in the most recent Foreign Affairs called “How America Lost Faith in Expertise,” by Tom Nichols, a professor at the U.S. Naval War College. Here’s the Russell quote:

The skepticism that I advocate amounts only to this: (1) that when the experts are agreed, the opposite opinion cannot be held to be certain; (2) that when they are not agreed, no opinion can be regarded as certain by a non-expert; and (3) that when they all hold that no sufficient grounds for a positive opinion exist, the ordinary man would do well to suspend his judgment… These propositions may seem mild, yet, if accepted, they would absolutely revolutionize human life.

Me at Nieman Lab: Hacking Consensus

For the past few years I’ve bent quite a few ears about how much better arguments could be online. The earliest of these ear-bendings (that I can remember) was in Q1 of 2008. Since then I’ve talked to policy wonks, developers, journalists, and plenty of friends and family about how I think the basic op-ed model should be improved. Four years after that first conversation, I finally have an essay on the topic that I feel comfortable standing behind.

My piece is up at Harvard’s Nieman Journalism Lab and I hope you’ll give it a read. I’d appreciate any feedback you have. Here’s the intro:

In a recent New York Times column, Paul Krugman argued that we should impose a tax on financial transactions, citing the need to reduce budget deficits, the dubious value of much financial trading, and the literature on economic growth. So should we? Assuming for a moment that you’re not deeply versed in financial economics, on what basis can you evaluate this argument? You can ask yourself whether you trust Krugman. Perhaps you can call to mind other articles you’ve seen that mentioned the need to cut the deficit or questioned the value of Wall Street trading. But without independent knowledge — and with no external links — evaluating the strength of Krugman’s argument is quite difficult.

It doesn’t have to be. The Internet makes it possible for readers to research what they read more easily than ever before, provided they have both the time and the ability to filter reliable sources from unreliable ones. But why not make it even easier for them? By re-imagining the way arguments are presented, journalism can provide content that is dramatically more useful than the standard op-ed, or even than the various “debate” formats employed at places like the Times or The Economist.

To do so, publishers should experiment in three directions: acknowledging the structure of the argument in the presentation of the content; aggregating evidence for and against each claim; and providing a credible assessment of each claim’s reliability. If all this sounds elaborate, bear in mind that each of these steps is already being taken by a variety of entrepreneurial organizations and individuals.
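To make those three directions a bit more concrete, here is a minimal sketch, in TypeScript, of how a published argument might be represented as structured data: explicit claims, evidence aggregated for and against each one, and a reliability assessment attached to each claim. This is my own illustration, not anything from the essay, and every name in it is hypothetical.

```typescript
// A rough sketch of an argument as structured data rather than prose.
// All type and field names here are hypothetical.

type Assessment = "well-supported" | "contested" | "unsupported" | "unrated";

interface Evidence {
  summary: string;   // one-line description of the evidence
  url: string;       // link to the underlying source
  supports: boolean; // true if it supports the claim, false if it cuts against it
}

interface Claim {
  text: string;           // the claim as stated in the piece
  evidence: Evidence[];   // aggregated evidence for and against
  assessment: Assessment; // an editor's or fact-checker's reliability rating
}

interface Argument {
  thesis: string;  // what the author wants you to conclude
  claims: Claim[]; // the claims the thesis rests on
}

// Example: a skeletal rendering of the Krugman column described above,
// with the evidence and assessments still to be filled in.
const transactionTax: Argument = {
  thesis: "The U.S. should impose a tax on financial transactions.",
  claims: [
    {
      text: "Budget deficits need to be reduced.",
      evidence: [],
      assessment: "unrated",
    },
    {
      text: "Much financial trading has dubious social value.",
      evidence: [],
      assessment: "unrated",
    },
  ],
};
```

Once an argument is broken out this way, the presentation, the evidence aggregation, and the reliability assessment can each be handled separately, which is the point of the three experiments above.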

Please read the rest!