Nieman Foundation at Harvard
Feb. 6, 2018, 3:48 p.m.
Audience & Social

Crowdsourcing trusted news sources can work — but not the way Facebook says it’ll do it

A new study finds asking Facebook users about publishers could “be quite effective in decreasing the amount of misinformation and disinformation circulating on social media” — but Facebook will need to make one important change to its plan.

On January 19, Facebook CEO Mark Zuckerberg explained how the company planned to decide which news sources it would prioritize in the (now with less news!) News Feed. “We decided that having the community determine which sources are broadly trusted would be most objective,” he wrote.

This plan — in which users would be asked two questions: whether they were familiar with a given news source, and how much they trusted it — was largely greeted with skepticism. “A reporter emailed me, like, ‘Hey, what do you think about this?’ and I started with a list of all the reasons that it seemed like a pretty terrible idea,” said David Rand, associate professor of psychology at Yale and the coauthor, with Gordon Pennycook, of much of the most-discussed research into fake news that’s been released in the past year or so. “Then I realized this is actually an empirical question. Why don’t we find out?”

In a working paper released Tuesday, Rand and Pennycook write that a plan somewhat like Facebook’s could actually work surprisingly well. “Algorithmically disfavoring news sources with low crowdsourced trustworthiness ratings may — if implemented correctly — be quite effective in decreasing the amount of misinformation and disinformation online,” they write.

But note that big “if.” One of their main findings was that crowdsourced trustworthiness ratings are actually much less effective if they exclude the ratings from people who are unfamiliar with a given site. Which is what Facebook plans to do.

“We ask people if they’re familiar with a publisher because, if they’re not, it’s not even worth asking if they trust the publisher,” Adam Mosseri, Facebook’s VP of News Feed, tweeted last month. “We eliminate any data about trust from people who don’t recognize a source.”

In fact, Rand and Pennycook find that “a lack of familiarity is an important cue for untrustworthiness.” In other words, the fact that someone hasn’t heard of fake news site worldnewsdailyreport.com makes them trust it less. Excluding ratings from participants who aren’t familiar with a given news source — in other words, only counting those who say they are familiar with worldnewsdailyreport.com — “dramatically reduces the difference between mainstream media sources and hyperpartisan or fake news sites.”

Here’s how the study worked: Rand and Pennycook used Mechanical Turk to survey about 1,000 U.S. residents, asking whether they were familiar with, and then how much they trusted, selections from a list of 60 sites (the wording was taken from the actual survey that Facebook has proposed using): 20 mainstream media sites, 20 “hyper-partisan” sites (like Breitbart, Daily Kos, and Infowars), and 20 fake news sites (categorized as such “due to the presence of false headlines from those sites on snopes.com,” including news4ktla.com, newsexaminer.net, and so on). Participants were then given the Cognitive Reflection Test (the results of that part of the survey aren’t discussed in this paper) and took a demographic questionnaire including questions about political preference.

Rand and Pennycook found “clear partisan differences in trust” — but the partisan differences were “much smaller than the overall difference between the mainstream media sources and the less reputable websites.” Overall, the crowdsourced ratings of a news outlet’s trustworthiness did “an excellent job of differentiating between reputable and non-reputable sources” — as long as the results weren’t filtered based on a respondent’s previous experience with a site.

Here’s what happens, though, when that filter — which, again, Facebook says it will use — is applied: Real, hyperpartisan, and fake news sites all become grouped much closer together in terms of user trust.

“Crowdsourced ratings of outlet trustworthiness do not do a particularly good job of differentiating between reputable and non-reputable sources if the ratings of unfamiliar participants are excluded,” Rand and Pennycook write. “Unfamiliarity is an important cue of untrustworthiness…insofar as crowdsourced trust ratings are aimed at differentiating mainstream media outlets from hyper-partisan or fake news outlets, our data strongly suggests that excluding unfamiliar results is a mistake.” (A hopeful sign: Rand said that people at Facebook are reading his and Pennycook’s work. “I think the fact that they got rid of the ‘disputed by third-party fact checker’ tags is evidence that they are paying attention to the research going on,” he told me.)
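The aggregation difference the paper describes can be sketched with a toy calculation. All ratings below are hypothetical illustrations (not the study’s data), and the function names are made up for this sketch; the point is only how dropping unfamiliar raters shrinks the gap between a well-known outlet and an obscure fake-news site.

```python
def trust_score(ratings, exclude_unfamiliar=False):
    """Average trust for one outlet from (familiar, trust) rating pairs.

    Each rating is (familiar: bool, trust: float on a 1-5 scale).
    With exclude_unfamiliar=True (Facebook's stated plan), ratings from
    people who don't recognize the outlet are dropped; otherwise every
    rating counts, so widespread unfamiliarity drags the average down.
    """
    kept = [t for familiar, t in ratings if familiar or not exclude_unfamiliar]
    return sum(kept) / len(kept) if kept else None

# Hypothetical ratings: a widely recognized mainstream outlet vs. a
# fake-news site almost nobody recognizes (unfamiliar raters report low trust).
mainstream = [(True, 4), (True, 5), (True, 4), (False, 3), (True, 4)]
fake_site  = [(False, 1), (False, 1), (True, 4), (False, 2), (False, 1)]

for name, ratings in [("mainstream", mainstream), ("fake", fake_site)]:
    print(name,
          trust_score(ratings),                            # all raters
          trust_score(ratings, exclude_unfamiliar=True))   # familiar only
```

Counting all raters, the mainstream outlet scores 4.0 against the fake site’s 1.8 — a clear separation. Keep only the familiar raters and the scores become 4.25 and 4.0: the one person who recognizes the fake site happens to trust it, and the gap nearly vanishes, which is the filtering problem Rand and Pennycook flag.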

These findings fit with an “initially skeptical” rather than “initially agnostic” model of media trust: They suggest that people initially assume that a news source they haven’t heard of before is untrustworthy. “The default is to not trust,” Rand told me. As someone becomes more familiar with a news outlet, they may change their mind. “The evidence we had for this going in was that we see there are a lot of outlets that are both low familiarity and low trust; there are a lot that are high familiarity and high trust; and there are some that are high familiarity and low trust.”

“It’s not that just automatically being familiar with something makes you trust it,” Rand added. “If you look at Breitbart or Infowars relative to the other hyperpartisan sites, their familiarity ratings are higher than their trust ratings. People know more about them; it doesn’t make them trust them more.” (By the way, looking at the study’s full list of familiarity and trust ratings for all 60 sources: PBS received the highest trust ratings from both Democrats and Republicans. After that, for Republicans, the #2 most-trusted outlet was Fox News; for Democrats, it was The New York Times. When Democrats and Republicans were combined, the news organizations with the highest trust ratings were: 1. PBS; 2. The New York Times; 3. NBC and CBS, tied; and 5. The Washington Post.)

This is potentially problematic for “highly rigorous news sources that are less well-known (or that are new),” the authors note, since using a crowdsourced approach that takes unfamiliarity into account, these will have trouble gaining prominence. “If you leave in all the [familiarity] ratings, it’ll be pretty good at keeping bad sites out, but it’s also going to keep good sites out,” Rand told me. “If you have the opposite [and exclude ratings from people who are unfamiliar with the source they are rating], you’re going to let in a lot of bad sites, but also let in good sites.”

One possible solution would be for Facebook to make the judgment call about how much it wants to keep out bad sites versus letting in good sites; it could decide how much weight to put on the unfamiliar judgments. A better solution, Rand said, “would be developing some approach where, for content sources that are unfamiliar, [Facebook] shows people sample content before they rate it.” He proposed a system in which sample articles would be picked at random from a news site, with each user seeing a different sample of article headlines before rating the source. “That would be way slower and a much more time- and money-intensive process for Facebook, but it’s potentially worth it,” Rand said. In fact, he thinks it should probably be done for all the news outlets Facebook asks people to rank, since hyperpartisan and fake sites have names that sound like the names of real news sites. Still, “it’s a serious challenge,” he acknowledged.
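That “judgment call” about trading off bad-site exclusion against good-site inclusion can be framed as a single tunable parameter: instead of dropping unfamiliar raters outright, down-weight them by a factor w. This is a sketch of one way such weighting could work, with hypothetical ratings and a made-up function name — it is not an implementation proposed by the paper or by Facebook.

```python
def weighted_trust(ratings, w):
    """Weighted mean trust; each rating is (familiar: bool, trust: float).

    Familiar raters get weight 1.0; unfamiliar raters get weight w.
    w = 1.0 counts everyone equally (the paper's preferred setup);
    w = 0.0 reproduces Facebook's plan of excluding unfamiliar raters;
    values in between let the platform choose the trade-off.
    """
    num = den = 0.0
    for familiar, trust in ratings:
        weight = 1.0 if familiar else w
        num += weight * trust
        den += weight
    return num / den if den else None

# Hypothetical obscure fake-news site: four unfamiliar low-trust raters,
# one familiar rater who happens to trust it.
fake_site = [(False, 1), (False, 1), (True, 4), (False, 2), (False, 1)]

print(weighted_trust(fake_site, w=1.0))  # everyone counts equally
print(weighted_trust(fake_site, w=0.0))  # only the familiar rater counts
print(weighted_trust(fake_site, w=0.5))  # intermediate trade-off
```

Sliding w from 1.0 to 0.0 moves this site’s score from 1.8 up to 4.0; the platform’s policy choice reduces to where on that slider it wants to sit.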

There were, certainly, differences in the ways that Democrats and Republicans ranked content. Democratic leaners trusted every mainstream media source more than Republicans did, with the single exception of Fox News. And Republicans were more likely to trust hyperpartisan and fake news sites — which “may be explained by Republican-leaning individuals being more likely to trust ‘alternative’ sources of news in general,” Pennycook and Rand note in the paper. Ultimately, though, the positive indication of this work is that “judgments about the trustworthiness of news sources is another case where accuracy trumps in-group favoritism.”

“My overall feeling about this is that the motivated reasoning, partisan bias stuff that gets so much attention these days is just a little overdone,” Rand told me. “It’s there. There’s no question. But it’s like a second-order thing. It’s smaller than people’s ability to know what is a real news outlet or a real news story.”

Photo by Alberto G. used under a Creative Commons license.
