Nieman Foundation at Harvard
July 28, 2017, 9:51 a.m.
Audience & Social

“Stories may have political impact less by persuading than by reminding people which side they are on”

Plus: “Humans can be successfully manipulated through social bots,” what Russia Today’s fact checking project actually does, and a more sociological take on the spread of fake news.

The growing stream of reporting on and data about fake news, misinformation, partisan content, and news literacy is hard to keep up with. This weekly roundup offers the highlights of what you might have missed.

“Humans can be successfully manipulated through social bots.” Chengcheng Shao, Giovanni Luca Ciampaglia, and others at Indiana University, Bloomington, analyzed 14 million tweets spreading 400,000 claims during and following the U.S. presidential campaign and election and found that “accounts that actively spread misinformation are significantly more likely to be bots.” Also, “humans do most of the retweeting, and they retweet claims posted by bots as much as by other humans. This suggests that humans can be successfully manipulated through social bots.” The paper offers a couple ideas on reducing bot activity; here’s one:

An alternative strategy would be to employ CAPTCHAs, challenge-response tests to determine whether a user is human… Their use to limit automatic posting or resharing of news links could stem bot abuse, but also add undesirable friction to benign applications of automation by legitimate entities, such as news media and emergency response coordinators.

It’s not just Twitter and Facebook! Not surprisingly, WhatsApp is a conduit for fake news too — at least in Kenya ahead of its upcoming general election, writes Abdi Latif Dahir for Quartz.

Analysts have labeled the spread of information on these messaging apps as ‘dark social,’ given that their effect cannot be measured or questioned publicly. Government officials in Kenya are also closely watching chatter on these apps, recently accusing the managers of 21 WhatsApp groups of spreading hate.

The Atlantic’s Alexis Madrigal coined the term “dark social” in 2012 to refer to traffic that includes no referral data and thus is “essentially invisible to most analytics programs.” That has meant, for instance, links shared via email, IM, and so on, but now we can add messaging apps like WhatsApp to the category. I looked around for more information on dark social and fake news and haven’t yet been able to find much, but here’s one story by BuzzFeed’s Pranav Dixit from January: “Viral WhatsApp hoaxes are India’s own fake news crisis.” And in March, we told you about WhatsApp misinformation in Colombia and one effort to push back.

“The media are a strategic asset, just like oil and gas.” Dana Priest looks at “Lessons from Europe’s fight against Russian disinformation” for The New Yorker.

In most of Europe, where hoax news stories and Web sites with bogus articles are muddying the digital pipeline of reliable information, political leaders have publicly reaffirmed their faith in the mainstream media and urged them to do a better job exposing imposters. With the help of journalists and researchers, the European Union’s East Stratcom Task Force has published thousands of examples of false or twisted stories in its weekly Disinformation Review, available in eighteen languages.

Can Russia Today’s fact-checking be taken seriously? Poynter looked at RT’s “FakeCheck.” Four months in, it has fact-checked only 16 stories, most of which had to do with “Russia’s image abroad or its foreign policy.” But, Alexios Mantzarlis and Anastasia Valeeva write, the selection bias isn’t the biggest problem: “The bigger problem is that it mixes dubious fact checks among the legitimate ones, leading to unproven or poorly sourced conclusions.” For instance, “Rumors about alleged Russian meddling in the Maltese elections were addressed by referring to the Russian Embassy’s statement on the matter. Allegations that Wikileaks had ties to Russia were ‘debunked’ by pointing to a quote by Julian Assange. In both these cases the evidence comes from self-interested sources.”

How stories become true. Sociology professor Francesca Polletta and Jessica Callahan of the University of California, Irvine, look at how “the rise of right-wing media outlets and the profusion of user-shared digital news” have changed storytelling (paywall), asking, “How was a story of middle-class whites pushed aside by a parade of minority groups, abandoned by the government, and treated with disdain by liberals made real?” They write:

Conservative media commentators often styled a personal relationship with the viewer or listener, in which allusive stories reinforced the bond between speaker and audience. The growth of user-shared digital “news” stories also worked to reinforce bonds of political partisanship. However, here, what was important was a style of storytelling. By sharing, liking, and commenting on outrageous stories — and by determinedly not questioning their factual accuracy — people signaled that they were savvy, scrappy, and clearly on one side of the partisan divide…

We miss the fact that people often interpret outrageous stories as evidence of a broader phenomenon; that stories about the way the world used to be often conflate history and nostalgia; that people’s relationship to media commentators affects what they take from the stories they hear; and that stories may have political impact less by persuading than by reminding people which side they are on.

Illustration from L.M. Glackens’ The Yellow Press (1910) via The Public Domain Review.

PART OF A SERIES     Real News About Fake News