May 11, 2022, 12:49 p.m.
Reporting & Production

Do browser extensions keep anyone away from fake news sites? Maybe a tiny bit

A new study finds that NewsGuard’s credibility ratings for news sites helped steer the most frequent consumers of misinformation towards more reliable outlets.

As more companies and platforms test whether fact-checking, flagging questionable content, or some other form of alert works best to dissuade people from consuming misinformation, a new study finds that credibility ratings for news sites may offer a tiny ray of hope, if users actually use them.

Conducted by researchers at New York University’s Center for Social Media and Politics using credibility ratings from the news-rating website NewsGuard, the study looked at data from more than 3,300 volunteers recruited to be surveyed about their news consumption habits, with a subset of around 970 volunteers recruited to have their online news consumption monitored. The study was conducted over two separate two-week periods between May and July 2020. The findings were published last week in Science Advances.

Overall, the study found that when people installed the NewsGuard extension, which labels each site with a “green” or “red” rating, with green indicating a trustworthy site, their tendency to focus on largely reliable news sites didn’t really change over the course of the study. (The volunteers were asked to install the extension as part of the study.)

For context, sites like CNN and The New York Times have earned “green” from NewsGuard. The Gateway Pundit and Daily Kos are rated “red.” (Fox News? Green.) For users who have NewsGuard’s extension installed, these ratings show up as shields embedded directly in search results, social media feeds, and the websites they visit (although it’s unclear how many typical news consumers will actually install a credibility extension before reading the news).

Although the study didn’t delve into specific topics, the researchers did assess respondents’ tendency to believe misinformation by posing 10 statements to them, five each about the Black Lives Matter movement and the Covid-19 pandemic. For instance, volunteers were asked whether BLM protesters were paid to attend protests (false) and whether Covid-19 is spread by 5G cell technology (also false).

Overall, three of each set of five statements were false and two were true. The researchers found that the intervention, i.e. installing the NewsGuard extension, also did not change how much participants believed in misinformation by the end of the study. (The authors didn’t look at the smaller group of the most frequent consumers of misinformation here, either.)

Other measures, such as trust in institutions, belief that fake news is a problem in general, and belief that fake news is a problem in the mainstream media, also did not change significantly with the intervention of installing the NewsGuard extension.

Going into the study, “We expected, optimistically, that [the intervention] would have a positive effect,” said Kevin Aslett, a postdoctoral associate in computational social science at NYU and lead author of the new study. They thought that being shown credibility ratings “would reduce misperceptions, it would reduce political cynicism, increase trust in media, and [have an impact on] all these downstream effects of exposure to misinformation … obviously that did not happen,” he said.

The authors measured news consumption as the proportion of time spent on reliable versus unreliable sites, to be sure that participants who visited unreliable news sites actually spent meaningful time on them. In the overall sample, the time spent on reliable versus unreliable sites also didn’t change in the group that was asked to install the NewsGuard extension compared to the control group.
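
To make that measure concrete, here is a minimal illustrative sketch, not the study’s actual code, of how a “share of time on reliable sites” figure might be computed from browsing logs. The per-domain ratings and the visit log below are hypothetical examples.

# Illustrative sketch only: share of rated browsing time spent on "green" domains.
from collections import defaultdict

ratings = {                      # hypothetical NewsGuard-style labels per domain
    "nytimes.com": "green",
    "cnn.com": "green",
    "thegatewaypundit.com": "red",
}

visits = [                       # hypothetical browsing log: (domain, seconds spent)
    ("nytimes.com", 300),
    ("thegatewaypundit.com", 120),
    ("cnn.com", 180),
]

def reliable_share(visits, ratings):
    """Proportion of rated browsing time spent on 'green' sites."""
    time_by_label = defaultdict(float)
    for domain, seconds in visits:
        label = ratings.get(domain)
        if label is not None:    # unrated domains are ignored
            time_by_label[label] += seconds
    total = sum(time_by_label.values())
    return time_by_label["green"] / total if total else None

print(f"Share of time on reliable sites: {reliable_share(visits, ratings):.2f}")
# Prints 0.80 for the example log above.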

However, this lack of change was likely because the people enrolled in the study were by and large already consumers of reliable news sites. Roughly 65% of those in the study already had a news diet made up of “green”-rated sites.

The researchers did notice the needle move on reliable news consumption when they looked at the bottom 10% to 20% of participants, those who spent the most time on unreliable news sites. Among this group, the study found a shift toward “green”-rated sites becoming more frequented sources of news by the end of the study.

Gordon Crovitz, co-CEO of NewsGuard (which was not involved in the study), said, “We’re pleased with the confirmation — as we’ve seen in a lot of other research — that if individuals have access to ratings, that they will value those ratings and will appreciate the added information about the source of news.” He asserted, “Nobody wants to rely on or share information from sources that repeatedly publish false content.”

At the same time, the researchers did not assess whether the intervention changed the time spent on “red”-rated sites within the smaller group of misinformation consumers; they worried that the small sample size at this level might not yield statistically meaningful results.

Still, “It’s really rare to find any lasting effects on people’s behavior the way we did here,” said Andrew Guess, an assistant professor of politics and public affairs at Princeton University and an author of the study. “For a very subtle intervention of that kind, I think that’s quite remarkable.”

Given the relatively small number of people who actually engage with misinformation and the relatively short duration of the study, the researchers are thinking about how to build on these findings. “I think we want to try and find a more targeted sample,” said Guess, meaning a sample focused on the heaviest consumers of misinformation, even if it isn’t representative of the larger population. Going beyond the two-week study duration is also a future goal.

Finally, “A big challenge is finding interventions that actually have lasting effects,” Guess said. For a lot of interventions for combating misinformation, effects are often fleeting, Guess explained, adding, “The jury’s still out on what kinds of strategies can actually produce lasting changes.”

Photo of traffic light by Niels Sienaert used under a Creative Commons license.
