Nieman Foundation at Harvard
June 3, 2013, 2:12 p.m. | Posted by Joshua Benton

Here’s a fascinating new NBER working paper that has implications for news media. It focuses on a well-known phenomenon: People with strong partisan opinions on politics are more likely to believe the facts back up their perspective. Those on the right are more likely to believe incorrect “facts” that put left-leaners in a bad light, and vice versa:

For example, Republicans are more likely than Democrats to say that the deficit rose during the Clinton administration; Democrats are more likely to say that inflation rose under Reagan.

In the real world, there’s little penalty for getting this sort of question wrong. It’s unlikely anyone will correct you, and even if they do, it’s very rare that someone will face any significant consequence — even a public shaming — for blaming the other side for something it didn’t do.

This new paper, by researchers at Yale and UC San Diego, tests that idea by upping the ante. Let’s say, when asking people these questions, you offered small payments to people for being correct. Would that change their behavior and make their statements better align with the facts?

The answer seems to be yes:

The experiments show that small payments for correct and “don’t know” responses sharply diminish the gap between Democrats and Republicans in responses to “partisan” factual questions. The results suggest that the apparent differences in factual beliefs between members of different parties may be more illusory than real. [emphasis mine]

The paper itself goes into much more detail. One piece of the experiment involved giving people the option to answer “don’t know” to a question where they might have partisan interest in one answer. If a “don’t know” response generated a financial reward — even though it was a smaller one than for getting the answer correct — roughly half of the responses became “don’t know.”

This pattern — frequent “don’t know” responses when subjects are paid to say “don’t know,” even though they are also offered more for correct responses — implies that participants are sufficiently uncertain about the truth that they expect to earn more by selecting “don’t know.”

This great willingness to select “don’t know” has important implications for our understanding of partisan divergence. In particular, participants who offer “don’t know” responses behave in a manner that is consistent with this hypothesis: they know that their responses are otherwise partisan and that they don’t know the truth. In the absence of incentives for “don’t know” responses, they would offer insincere partisan responses, even if paid for correct ones, because they are both uninformed about the truth and aware of their ignorance.
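The tradeoff the authors describe boils down to a simple expected-value comparison: an uncertain respondent earns more, on average, from the smaller-but-guaranteed “don’t know” payment than from a long-shot partisan guess. A minimal sketch of that logic, with purely hypothetical payment amounts (not taken from the paper):

```python
# Hypothetical sketch of the survey incentive described above.
# The payment amounts are illustrative assumptions, not the paper's figures.
def best_response(p_correct: float,
                  pay_correct: float = 1.00,
                  pay_dont_know: float = 0.33) -> str:
    """Return the payoff-maximizing answer for a respondent who believes
    their partisan-leaning guess is right with probability p_correct."""
    expected_from_guessing = p_correct * pay_correct  # paid only if the guess is right
    return "guess" if expected_from_guessing > pay_dont_know else "don't know"

# A respondent who is fairly confident should still guess...
print(best_response(0.7))   # guess
# ...but one who privately knows they're unsure maximizes earnings by admitting it.
print(best_response(0.2))   # don't know
```

Under these (assumed) numbers, anyone who privately rates their chance of being right below about a third does better by taking the “don’t know” payment — which is exactly the behavior the experiment elicited.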

This opens up a host of questions for journalism’s growing fact-checking industry. The idea that people who promulgate inaccurate ideas know they’re inaccurate — or at least know they don’t know the truth — fits into a larger set of evidence that merely presenting “the facts” doesn’t always lead to a more informed audience. “Facts” become more grist for the mill of identity creation — I’m a Democrat, so I say bad things about Republicans, facts be damned, or vice versa.

It also provides backing to Brendan Nyhan’s ideas about potential “backfire” in fact-checking — that what journalists perceive as a neutral recounting of reality can in fact be perceived as raising the stakes of a partisan battle and can engender a hardening of incorrect beliefs. Here’s Joe Keohane writing about the subject in 2010:

Facts don’t necessarily have the power to change our minds. In fact, quite the opposite. In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts, they found, were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger.

“The general idea is that it’s absolutely threatening to admit you’re wrong,” says political scientist Brendan Nyhan, the lead researcher on the Michigan study.

But dangling a buck in front of partisans seems to go a long way toward letting people do just that.

Paying people to acknowledge their ignorance doesn’t seem like a workable large-scale strategy. But it does open up a range of ideas about how we might create incentive structures that reward people for acknowledging when they don’t know what they’re talking about. And it also means that at least some of the polarization we’ve seen in American politics in recent decades might be only skin-deep. From the paper:

We find that small financial inducements for correct responses can substantially reduce partisan divergence, and that these reductions are even larger when inducements are also provided for “don’t know” answers. In light of these results, survey responses that indicate partisan polarization with respect to factual matters should not be taken at face value. Researchers and general analysts of public opinion should consider the possibility that the appearance of polarization is to a great extent an artifact of survey measurement rather than evidence of real differences in beliefs.
