Nieman Foundation at Harvard
May 31, 2019, 9:20 a.m.
Audience & Social

What do we do about the “shallowfake” Nancy Pelosi video and others like it?

Plus: A new raft of misinformation research to read, and the actual dollar cost of anti-vaxxing.

The growing stream of reporting on and data about fake news, misinformation, partisan content, and news literacy is hard to keep up with. This weekly roundup offers the highlights of what you might have missed.

Facebook and the Pelosi video. A week ago, The Washington Post reported that altered videos (“shallowfakes”) of House Speaker Nancy Pelosi — slowed down to make it look as if she were drunk and slurring her words — were spreading on social media. Rudy Giuliani, Donald Trump’s personal attorney, tweeted one of them (though he later deleted the tweet). From the Post:

One version, posted by the conservative Facebook page Politics WatchDog, had been viewed more than 2 million times by Thursday night, been shared more than 45,000 times, and garnered 23,000 comments with users calling her “drunk” and “a babbling mess.”

YouTube took the videos down. Facebook said it would downrank them, but wouldn’t remove them altogether.

“We don’t have a policy that stipulates that the information you post on Facebook must be true,” Facebook said in a statement to The Washington Post.

The company said it instead would “heavily reduce” the video’s appearances in people’s news feeds, append a small informational box alongside the video linking to fact-checking sites, and open a pop-up box linking to “additional reporting” whenever someone clicks to share the video.

Monika Bickert, Facebook’s head of product policy and counterterrorism, told CNN’s Anderson Cooper that Facebook’s policy is that “people make their own informed choice about what to believe. Our job is to make sure we’re getting them accurate information.” She claimed that “anybody who is seeing this video in their news feed, anybody who is going to share it to somebody else, anybody who has shared it in the past, they are being alerted that this video is false.”

“You’re making money by being in the news business,” Cooper said. “If you can’t do it well, shouldn’t you just get out of the news business?”

In the days that have followed, more people have criticized Facebook for not removing the video entirely. “I think they have proven — by not taking down something they know is false — that they were willing enablers of the Russian interference in our election,” Pelosi herself told KQED News this week. (That’s kind of a stretch.) Hillary Clinton described the video as “sexist trash,” adding, “Let’s send a message to Facebook that those who are in Facebook’s communities would really like Facebook to pay attention to false and doctored videos before we are flooded with them over the next months.”

“This latest doctored video proves that Facebook as we knew it is over,” Kara Swisher (who pressed Mark Zuckerberg last year on Facebook’s decision to downrank rather than remove Holocaust-denier posts) wrote in a New York Times op-ed:

The only thing the incident shows is how expert Facebook has become at blurring the lines between simple mistakes and deliberate deception, thereby abrogating its responsibility as the key distributor of news on the planet.

Would a broadcast network air this? Never. Would a newspaper publish it? Not without serious repercussions. Would a marketing campaign like this ever pass muster? False advertising.

No other media could get away with spreading anything like this because they lack the immunity protection that Facebook and other tech companies enjoy under Section 230 of the Communications Decency Act. Section 230 was intended to spur innovation and encourage start-ups. Now it’s a shield to protect behemoths from any sensible rules.

Ian Bogost in The Atlantic:

Cooper, the journalist, can’t understand why Bickert won’t see the video as propaganda posing as journalism. Bickert, the social-media executive, can’t grasp why Cooper is able to see the video only as journalism or propaganda, and not just as content, that great gray slurry that subsumes all other meaning…

When Facebook says it’s not a news company, it doesn’t just mean that it doesn’t want to fall under the legal and moral responsibilities of a news publisher. It also means that it doesn’t care about journalism in the way that newsmakers (and hopefully citizens) do, and that it doesn’t carry out its activities with the same goals in mind.

But there’s also an argument to be made that the matter isn’t that clear-cut. Facebook’s statement, “We don’t have a policy that stipulates that the information you post on Facebook must be true,” may sound nuts until you remember that a blanket policy against untrue information would also affect stuff like Onion articles. Angela Chen in MIT Technology Review:

The video makes Pelosi look strange, but it’s not a “deepfake” — it doesn’t falsely show her saying anything she didn’t actually say. It’s an example of what MIT Media Lab researcher Hossein Derakhshan calls “malinformation,” or basically true information that has been subtly manipulated to harm someone.

And while Derakhshan thinks it might have been okay to take down a faked video that showed Pelosi making a racial slur, for example, he argues that it’s too much to expect platforms to police malinformation. “Usually when people do these manipulations, I don’t think it’s illegal,” he says. “So there’s no ground for removing them or requesting them to be taken off these platforms.”

Casey Newton suggests a new set of standards:

A policy that enables the maximum amount of political speech, save for a small number of exemptions outlined in a publicly posted document, has a logical coherence that “take down stuff I don’t like” does not.

That’s one reason why the take-it-down brigade might consider developing an alternate set of Facebook community standards for public consideration. I have no doubt that there are better ways to draw the boundaries here — to swiftly purge malicious propaganda, while promoting what is plainly art. But someone has to draw those boundaries, and defend them.

Alternatively, you could break Facebook up into its constituent parts, and let the resulting Baby Books experiment with standards of their own. Perhaps WhatsApp, stripped of all viral forwarding mechanics, would find a slowed-down Pelosi video acceptable when shared from one friend to another. Meanwhile Instagram would rapidly detect the video’s surging popularity and ensure that nothing like it appeared on the app’s Explore page, where the company could unwittingly aid in its distribution the way Facebook’s News Feed algorithm did this time around. Making communities smaller can make it easier to craft rules that fit them.

And the Times’ Farhad Manjoo says that the Pelosi video isn’t what we really need to be worrying about.

Whatever Facebook decides to do with this weird little video is a big meh, because if you were to rank the monsters of misinformation that American society now faces, amateurishly doctored viral videos would clock in as mere houseflies in our midst. Worry about them, sure, but not at the risk of overlooking a more clear and present danger, the million-pound, forked-tongue colossus that dominates our misinformation menagerie: Fox News and the far-flung, cross-platform lie machine that it commands.

To read! The International Communication Association’s 69th annual conference took place in Washington, DC this week. Here are some tweets with papers and research to follow up on (let’s call this my to-do list):

Anti-vaxxing’s cost in dollars. Wired’s Maryn McKenna takes a look:

Researchers at the CDC estimated that handling 107 cases of measles that occurred in 2011 cost state and local health departments between $2.7 million and $5.3 million. In 2014, 42 people came down with the disease after passing through Disneyland at the same time as a never-identified person with measles — and subsequently infected 90 additional people in California, 14 more in other states, and a further 159 people in Canada. The cost of controlling the outbreak, just in California, totaled almost $4 million. And in 2017, a five-month outbreak of measles in Minnesota infected 79 people and cost the state $2.3 million.

The funding to support that work isn’t being conjured out of the air. It’s coming from the budgets of public agencies, which have already been facing years of cuts and have no secret stashes of discretionary money to spend.

“There are substantial public health responses that go into mitigating an outbreak, and we should pursue those, because they prevent larger outbreaks or broader social disruption,” says Saad Omer, a physician and epidemiologist at Emory University and the senior author of a recent paper on the “true cost” of measles outbreaks. “But it does result in a lot of costs that can be pretty substantial. And we don’t measure the further indirect costs to the community.”

Illustration from L.M. Glackens’ The Yellow Press (1910) via The Public Domain Review.

Laura Hazard Owen is the editor of Nieman Lab. You can reach her via email or Twitter DM (@laurahazardowen).
PART OF A SERIES     Real News About Fake News