Nieman Foundation at Harvard
June 21, 2019, 8 a.m.
Audience & Social

“First-generation fact-checking” is no longer good enough. Here’s what comes next

Plus: Updates from GlobalFact 6 and The Verge’s second Facebook content moderation exposé.

The growing stream of reporting on and data about fake news, misinformation, partisan content, and news literacy is hard to keep up with. This weekly roundup offers the highlights of what you might have missed.

“Fact checkers need to move from ‘publish and pray’ to ‘publish and act.’” “The idea that fact checking can work by correcting the public’s inaccurate beliefs on a mass scale alone doesn’t stack up,” write representatives from Full Fact (U.K.), Africa Check (South Africa), and Chequeado (Argentina), in a manifesto of sorts published Thursday to all three sites.

“First-generation fact-checking” — the approach of simply publishing fact-checks — is a worthy effort, the authors write, but it isn’t enough if you actually want to change people’s minds. “Nobody should be surprised when, despite fact checkers publishing lots of fact checks, people still believe inaccurate things and politicians still spin and distort. Fact checking can work but not if this is all we do.” Full Fact, Africa Check, and Chequeado argue instead for a second-generation approach that includes not just publishing but also applying pressure and working for system change:

First, we move from just publishing to “publish and act.” We seek corrections on the record, pressure people not to make the same mistake again, complain where possible to a standards body. In other words, we use whatever forms of moral, public, or where appropriate regulatory pressure are available to stop the spread of specific bits of misinformation.

Secondly, we recognize that our fact checking provides a unique evidence base that gives us important insight into where misleading claims come from in public life and how they are spread…

Thirdly, we work for system change. Using the evidence from our fact checks we identify patterns and common causes, points where we can intervene to significantly reduce particular kinds or sources of misinformation. The pattern might be who’s publishing something, where it’s published, a particular subject that there’s a lot of false information about, or something else. The interventions can range from educating children or adults to advocating for policy changes.

Finally, culture. We are trying to create institutions in different societies that can help anchor public debate to reality and to challenge the casual acceptance of deceptive and misleading behavior. This is a long-term task: it involves earning good trusted reputations and not just getting attention. It needs funders to think long-term as well as fact checkers.

And, they acknowledge, they’re striving for a third generation of fact-checking that “will have to address all these issues — but also be able to function at internet scale, be massively collaborative, and work across international borders.”

The sixth annual fact-checking summit, GlobalFact 6, took place this week in Cape Town.

“It’s a sweatshop in America.” The Verge’s Casey Newton published another horrifying story (here’s the first one) on working conditions for Facebook content moderators, this time at the Cognizant site in Tampa that is “Facebook’s worst-performing content moderation site in America.” The content moderators are contract workers, not Facebook employees.

The story outlines the gross working conditions that the content moderators face and shows them horribly affected by the videos they have to moderate, especially those that feature animal abuse. But I kept thinking about this part of the article:

In June 2018, a month into his job, Facebook began seeing a rash of videos that purportedly depicted organs being harvested from children. (They did not.) So many graphic videos were reported that they could not be contained in Speagle’s queue.

“I was getting the brunt of it, but it was leaking into everything else,” Speagle said. “It was mass panic. All the SMEs had to rush in there and try to help people. They were freaking out — they couldn’t handle it. People were crying, breaking down, throwing up. It was like one of those horror movies. Nobody’s prepared to see a little girl have her organs taken out while she’s still alive and screaming.” Moderators were told they had to watch at least 15 to 30 seconds of each video.

The debunk of the organ-harvesting videos was an update added to the story after publication: “This article has been updated to reflect the fact that a video that purportedly depicted organ harvesting was determined to be false and misleading.” These videos have, in fact, been debunked multiple times. That makes them no less horrifying to watch, since they actually show children receiving medical care after airstrikes; to the moderators, they felt real. What I don’t understand is why moderators were seeing so much content that had already been debunked, presumably including by Facebook’s own fact-checkers, without being told it was fake. Knowing that might have made moderating it less scarring. If that education isn’t happening, why not?

I don’t question the moderators’ accounts. I think this just goes to show, again, that this work is hard and mentally scarring and shouldn’t be foisted onto contract employees without proper training. But then, that’s the whole point of the article.

The report came out the same day that Facebook announced it’s launching a global cryptocurrency.

Illustration from L.M. Glackens’ The Yellow Press (1910) via The Public Domain Review.
