June 26, 2020, 8:15 a.m.
Audience & Social

The little things — pop-ups, notifications, warnings — work to fight fake news, new evidence shows

Plus: A look at COVID-19 misinformation in Black online communities, and how conservative media may have made the pandemic worse.

The growing stream of reporting on and data about fake news, misinformation, partisan content, and news literacy is hard to keep up with. This weekly roundup offers the highlights of what you might have missed.

“Relatively short, scalable interventions could be effective in fighting misinformation around the world.” In 2017, Facebook released a set of “Tips to spot false news.” Developed in collaboration with First Draft, the tips were “promoted at the top of users’ news feeds in 14 countries in April 2017 and printed in full-page newspaper advertisements in the United States, the United Kingdom, France, Germany, Mexico, and India,” write the authors of a study published this week in PNAS. “A variant of these tips was later distributed by WhatsApp (a Facebook subsidiary) in advertisements published in Indian and Pakistani newspapers in 2018. These tips are therefore almost surely the most widely disseminated digital media literacy intervention conducted to date.”

The researchers tested the effectiveness of these tips on audiences in the U.S. and India — and found that they worked.

Strikingly, our results indicate that exposure to variants of the Facebook media literacy intervention reduces people’s belief in false headlines. These effects are not only an artifact of greater skepticism toward all information — although the perceived accuracy of mainstream news headlines slightly decreased, exposure to the intervention widened the gap in perceived accuracy between mainstream and false news headlines overall. In the United States, the effects of the treatment were particularly strong and remained statistically measurable after a delay of approximately 3 weeks. These findings suggest that efforts to promote digital media literacy can improve people’s ability to distinguish between false and mainstream news content, a result with important implications for both scientific research into why people believe misinformation online and policies designed to address the problem.

“A brief intervention which could be inexpensively disseminated at scale can be effective at reducing the perceived accuracy of false news stories,” the authors conclude, “helping users more accurately gauge the credibility of news content they encounter on different topics or issues.”
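The key comparison here is the gap in perceived accuracy between mainstream and false headlines: if the tips merely bred blanket skepticism, ratings for both would fall in tandem. A minimal sketch with invented numbers (the rating scale and values are illustrative assumptions, not the paper's data) shows how a small dip for mainstream headlines plus a larger dip for false ones still widens that gap:

# Illustrative numbers only, not from the study: mean perceived-accuracy
# ratings before ("control") and after ("treated") seeing the tips.
mainstream = {"control": 2.9, "treated": 2.8}  # slight dip: a bit more skepticism overall
false_news = {"control": 2.0, "treated": 1.6}  # larger dip for false headlines

def discernment(group: str) -> float:
    """Gap between perceived accuracy of mainstream and false headlines."""
    return round(mainstream[group] - false_news[group], 2)

print(discernment("control"))  # 0.9
print(discernment("treated"))  # 1.2, a wider gap: better discrimination, not just blanket doubt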

Consumer Reports’ Kaveh Waddell (he’s an investigative reporter at the Consumer Reports Digital Lab, which launched last year and which I’m looking forward to reading more from) points out that Facebook itself could surely shed further light on these research findings: “The company should know how many people clicked on the media literacy list, how long they spent on that page, whether they later changed their reading or sharing habits, and how long any effects lasted.” But it’s not sharing. “These scholars did an amazing job of looking at the scale of the intervention with the tools they had available, but I’m just so disappointed that there isn’t a way for an independent audit of what happened on the platform,” First Draft’s Claire Wardle told Waddell.

On the topic of brief interventions, Facebook is taking a cue from The Guardian and will show a warning if users try to share a story that’s more than 90 days old. (If they still want to share it after that, they can.) Other types of notifications may be coming, too. From Facebook’s John Hegeman, VP of feed and stories:

Over the past several months, our internal research found that the timeliness of an article is an important piece of context that helps people decide what to read, trust and share. News publishers in particular have expressed concerns about older stories being shared on social media as current news, which can misconstrue the state of current events. Some news publishers have already taken steps to address this on their own websites by prominently labeling older articles to prevent outdated news from being used in misleading ways.

Over the next few months, we will also test other uses of notification screens. For posts with links mentioning COVID-19, we are exploring using a similar notification screen that provides information about the source of the link and directs people to the COVID-19 Information Center for authoritative health information. Through providing more context, our goal is to make it easier for people to identify content that’s timely, reliable and most valuable to them.

(OK, now do it for Trump’s posts.)
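Facebook hasn’t published implementation details, but the behavior it describes reduces to a simple gate in front of the share action. Here’s a minimal sketch of that flow in Python; the function names, the keyword check for COVID-19 links, and the overall structure are assumptions on my part, and only the 90-day threshold comes from the announcement itself:

from datetime import datetime, timezone
from typing import Optional

STALENESS_THRESHOLD_DAYS = 90  # the cutoff Facebook announced

def notification_for(link_text: str, published_at: datetime) -> Optional[str]:
    """Pick which notification screen, if any, to show before sharing."""
    age_days = (datetime.now(timezone.utc) - published_at).days
    if age_days > STALENESS_THRESHOLD_DAYS:
        return "staleness_warning"   # article is more than 90 days old
    if "covid-19" in link_text.lower():
        return "covid_info_screen"   # points users to the COVID-19 Information Center
    return None                      # no interstitial; the share goes through

def share(link_text: str, published_at: datetime, user_confirmed: bool = False) -> str:
    """Show the relevant screen once; a user who confirms can still share."""
    screen = notification_for(link_text, published_at)
    if screen and not user_confirmed:
        return "show:" + screen
    return "shared"

# Example: a 2019 story triggers the staleness screen; confirming shares it anyway.
old_story = datetime(2019, 11, 1, tzinfo=timezone.utc)
print(share("Flood coverage", old_story))                       # show:staleness_warning
print(share("Flood coverage", old_story, user_confirmed=True))  # shared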

“A symptom of an information ecosystem poisoned by racial inequality.” The Shorenstein Center has a report on COVID-19 misinformation in Black online communities in the United States — an especially crucial topic since Black people are disproportionately affected by the coronavirus, dying of it at a higher rate than White people. Brandi Collins-Dexter identified four main strands of misinformation circulating — some organic, some “targeted directly at the community by outsiders”:

1. Black people could not die from COVID-19
2. The virus was man-made for the purposes of population control
3. The virus could be contained through use of herbal remedies
4. 5G radiation was the root cause of COVID-19

“Our research makes clear that the health misinformation surrounding COVID-19 poses an immediate threat to the health of Black people, and is a symptom of an information ecosystem poisoned by racial inequality,” Collins-Dexter writes.

While there is much to be learned about COVID-19 and how it works, it is clear that misinformation and conspiratorial frames that suggest that Black people are somehow inoculated from the disease are both dangerous and patently untrue. Black lives are consistently put in danger, and it is incumbent upon community actors, media, government, and tech companies alike to do their part to ensure that timely, local, relevant, and redundant public health messages are served to all communities.

“Consuming far-right media and social media content was strongly associated with low concern about the virus at the onset of the pandemic.” The Washington Post’s Christopher Ingraham has a very useful, detailed roundup of three recent studies focused on “conservative media’s role in fostering confusion about the seriousness of the coronavirus. Taken together, they paint a picture of a media ecosystem that amplifies misinformation, entertains conspiracy theories and discourages audiences from taking concrete steps to protect themselves and others.”

Danger sign photo by svantassel used under a Creative Commons license.

Laura Hazard Owen is the editor of Nieman Lab. You can reach her via email (laura_owen@harvard.edu) or Twitter DM (@laurahazardowen).