Sept. 29, 2017, 9:20 a.m.
Audience & Social

“Platforms for all ideas.” Russian misinformation for all swing states. Obituaries for all fake news writers.

Also: Mark Zuckerberg has regrets, and the Russians’ role as cultural flame-fanners.

The growing stream of reporting on and data about fake news, misinformation, partisan content, and news literacy is hard to keep up with. This weekly roundup offers the highlights of what you might have missed.

Fake news writers get LONG obituaries now. Last week, disinfo bros; this week, Paul Horner, who appears to have OD’d on prescription drugs. Here’s The New York Times, The Arizona Republic (double byline, multiple sections including a close look at his charity to hand out socks to homeless people), NPR, CBS, NBC, BuzzFeed News, Gizmodo. The Washington Post has a slightly different take: “Who do you believe when a famous Internet hoaxer is said to be dead?”

Why is “was a jackass online” justification for so many extensive obituaries? Horner’s quote to The Washington Post last November, “I think Trump is in the White House because of me,” was repeated in each obituary I found, despite the fact that there is zero evidence to back that claim (and it would be impossible to prove anyway). Each is illustrated with the same CNN screenshot.

David Uberti writes at Splinter:

The undercurrent flowing through such stories — you can also read this between the lines of coverage of Milo, Breitbart, and the like — is that these figures are messengers of the new-media counterculture. Edgy, even.

My counterargument: They are bad people doing bad things, and they represent a cancer within the manic-depressive media environment we all inhabit. If the journalists who write about these people can’t make their own moral judgments about how terrible they are for all the rest of us, we’re in even more trouble than we think.

There was more fake news in swing states (and a lot of fake news on Twitter). Researchers from Oxford’s Computational Propaganda Project looked at the location data of political news being shared on Twitter in the 10-day period around the 2016 election. They started out looking at 22,117,221 “tweets collected between November 1-11, 2016, that contained hashtags related to politics and the election in the U.S.” (The Computational Propaganda Project’s previous work has also looked at tweets that include hashtags.) About a third of those contained enough location information to determine the state the user was in. “Many of the swing states getting highly concentrated doses of polarizing content were also among those with large numbers of votes in the Electoral College,” they write.

Of the tweets being shared: “20% of all the links being shared with election-related hashtags came from professional news organizations.”

The number of links to professionally produced content is less than the number of links to polarizing and conspiratorial junk news. In other words, the number of links to Russian news stories, unverified or irrelevant links to WikiLeaks pages, or junk news was greater than the number of links to professionally researched and published news. Indeed, the proportion of misinformation was twice that of the content from experts and the candidates themselves.

Second, a worryingly large proportion of all the successfully catalogued content provides links to polarizing content from Russian, WikiLeaks, and junk news sources. This content uses divisive and inflammatory rhetoric, and presents faulty reasoning or misleading information to manipulate the reader’s understanding of public issues and feed conspiracy theories. Thus, when links to Russian content and unverified WikiLeaks stories are added to the volume of junk news, fully 32% of all the successfully catalogued political content was polarizing, conspiracy driven, and of an untrustworthy provenance.

The New York Times’ Daisuke Wakabayashi and Scott Shane reported this week that Twitter will testify before Congress about its role in the election. The article includes information from the Alliance for Securing Democracy, which is housed at the German Marshall Fund of the United States in Washington, D.C. The Times notes that researchers at the Alliance for Securing Democracy:

…have been publicly tracking 600 Twitter accounts — human users and suspected bots alike — they have linked to Russian influence operations. Those were the accounts pushing the opposing messages on the N.F.L. and the national anthem.

Of 80 news stories promoted last week by those accounts, more than 25 percent ‘had a primary theme of anti-Americanism,’ the researchers found. About 15 percent were critical of Hillary Clinton, falsely accusing her of funding left-wing antifa — short for anti-fascist — protesters, tying her to the lethal terrorist attack in Benghazi, Libya, in 2012 and discussing her daughter Chelsea’s use of Twitter. Eleven percent focused on wiretapping in the federal investigation into Paul Manafort, President Trump’s former campaign chairman, with most of them treating the news as a vindication for President Trump’s earlier wiretapping claims.

“What we see over and over again is that a lot of the messaging isn’t about politics, a specific politician, or political parties,” Laura Rosenberger, the director of the Alliance for Securing Democracy, told the Times. “It’s about creating societal division, identifying divisive issues and fanning the flames.”

Mark Zuckerberg and “both sides.” President Barack Obama spoke with Facebook CEO Mark Zuckerberg a couple months before the election and “made a personal appeal to Zuckerberg to take the threat of fake news and political disinformation seriously,” The Washington Post’s Adam Entous, Elizabeth Dwoskin, and Craig Timberg reported. On Wednesday, Zuckerberg wrote in a Facebook post: “After the election, I made a comment that I thought the idea misinformation on Facebook changed the outcome of the election was a crazy idea. Calling that crazy was dismissive and I regret it. This is too important an issue to be dismissive.”

But he also framed it as an issue with two sides: “Trump says Facebook is against him. Liberals say we helped Trump. Both sides are upset about ideas and content they don’t like. That’s what running a platform for all ideas looks like.”

Umm, this is starting to sound a little disinfobro-ish. (“We’re all now questioning reality as it’s being handed down.”)

Read her whole thread, but…

Illustration from L.M. Glackens’ The Yellow Press (1910) via The Public Domain Review.

Laura Hazard Owen is the editor of Nieman Lab. You can reach her via email (laura_owen@harvard.edu) or Twitter DM (@laurahazardowen).