March 9, 2018, 9:01 a.m.
Audience & Social

Fear, surprise, disgust: Fake news spreads faster than some real news on Twitter

Plus: A big overview of all the research that we have so far.

The growing stream of reporting on and data about fake news, misinformation, partisan content, and news literacy is hard to keep up with. This weekly roundup offers the highlights of what you might have missed.

“It’s easier to be novel and surprising when you’re not bound by reality.” It’s not bots. It’s us. A paper published on Thursday in Science (it’s the cover story) by MIT’s Soroush Vosoughi, Deb Roy, and Sinan Aral tracks the spread of fake and real news tweets and finds that fake news both reached more people and spread faster than the truth — BUT there are caveats about the “true” news here: It was mostly news that had been fact-checked by outlets like Snopes and PolitiFact, not some of the legit-crazy real stuff that’s been in the headlines of the nation’s largest papers recently.

The researchers looked at 126,000 “rumor cascades” spread by about 3 million people.

A rumor cascade begins on Twitter when a user makes an assertion about a topic in a tweet, which could include written text, photos, or links to articles online. Others then propagate the rumor by retweeting it. A rumor’s diffusion process can be characterized as having one or more cascades, which we define as instances of a rumor-spreading pattern that exhibit an unbroken retweet chain with a common, singular origin.
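To make that definition concrete, here is a minimal sketch (not the authors' code) of a retweet cascade represented as a tree whose root is the origin tweet, with the two basic cascade measures — size (total tweets) and depth (longest unbroken retweet chain) — computed recursively. The tweet IDs and structure are invented for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class CascadeNode:
    """One (re)tweet in a cascade: a node in an unbroken retweet tree."""
    tweet_id: str
    children: list = field(default_factory=list)

def cascade_size(root: CascadeNode) -> int:
    """Total number of tweets in the cascade (the origin plus all retweets)."""
    return 1 + sum(cascade_size(c) for c in root.children)

def cascade_depth(root: CascadeNode) -> int:
    """Length of the longest retweet chain starting from the origin tweet."""
    if not root.children:
        return 0
    return 1 + max(cascade_depth(c) for c in root.children)

# One cascade with a single, common origin tweet.
origin = CascadeNode("t1", [CascadeNode("t2", [CascadeNode("t4")]), CascadeNode("t3")])
print(cascade_size(origin))   # 4 tweets total
print(cascade_depth(origin))  # longest chain t1 -> t2 -> t4, depth 2
```

A rumor that gets tweeted independently by several users would be several such trees — one cascade per origin.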

Note that not all of the “rumor cascades” were fake; they were a mix of true, false, and partially true stories: “We sampled all rumor cascades investigated by six independent fact-checking organizations (snopes.com, politifact.com, factcheck.org, truthorfiction.com, hoax-slayer.com, and urbanlegends.about.com) by parsing the title, body, and verdict (true, false, or mixed) of each rumor investigation reported on their websites and automatically collecting the cascades corresponding to those rumors on Twitter. The result was a sample of rumor cascades whose veracity had been agreed on by these organizations 95 to 98 percent of the time.”
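That 95-to-98-percent figure is just the share of rumors on which the investigating organizations’ verdicts coincided. As a hedged sketch of that agreement check (the site names come from the paper; the rumors and verdicts below are invented):

```python
# Hypothetical verdicts per rumor, keyed by fact-checking organization.
verdicts = {
    "rumor-001": {"snopes.com": "false", "politifact.com": "false", "factcheck.org": "false"},
    "rumor-002": {"snopes.com": "true", "truthorfiction.com": "true"},
    "rumor-003": {"snopes.com": "mixed", "hoax-slayer.com": "false"},
}

def agreement_rate(verdicts: dict) -> float:
    """Share of rumors on which every investigating organization gave the same verdict."""
    agreed = sum(1 for orgs in verdicts.values() if len(set(orgs.values())) == 1)
    return agreed / len(verdicts)

print(f"{agreement_rate(verdicts):.0%}")  # 2 of 3 toy rumors agree -> 67%
```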

The true news was — again — primarily stuff that had been fact-checked; here, for instance, are the “Rating: True” stories on Snopes. Many of them are downright boring and not something that most people would bother tweeting about. (“Was Ex-California State Senator and Gun Control Advocate Leland Yee Arrested for Gun Trafficking?” Yes.) In order to help generalize results beyond fact-checked stories, however, the researchers had three college students analyze a second sample of more than 13,000 rumor cascades that hadn’t been verified by any fact-checking organization. “When we compared the diffusion dynamics of the true and false rumors that the annotators agreed on, we found results nearly identical to those estimated with our main data set,” the authors write.
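One standard statistic for quantifying how well a fixed panel of annotators agrees on categorical labels like these is Fleiss’ kappa. Whether or not it matches the study’s exact procedure, a generic sketch with toy ratings looks like this:

```python
def fleiss_kappa(ratings):
    """Fleiss' kappa. ratings[i][j] = number of annotators who assigned
    item i to category j; assumes the same number of raters per item."""
    n_items = len(ratings)
    n_raters = sum(ratings[0])
    n_categories = len(ratings[0])
    # Overall proportion of assignments falling in each category.
    p_j = [sum(row[j] for row in ratings) / (n_items * n_raters)
           for j in range(n_categories)]
    # Observed agreement on each item.
    P_i = [(sum(c * c for c in row) - n_raters) / (n_raters * (n_raters - 1))
           for row in ratings]
    P_bar = sum(P_i) / n_items          # mean observed agreement
    P_e = sum(p * p for p in p_j)       # agreement expected by chance
    return (P_bar - P_e) / (1 - P_e)

# Toy example: 3 annotators labeling 4 cascades as [true, false, mixed].
ratings = [
    [3, 0, 0],  # unanimous "true"
    [0, 3, 0],  # unanimous "false"
    [1, 2, 0],  # split
    [0, 3, 0],
]
print(round(fleiss_kappa(ratings), 2))  # well above chance on this toy data
```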

The breathless coverage of the study itself (“False news 70 percent more likely to spread on Twitter,” “Fake news is 70% more likely to be shared on Twitter than true stories, ‘stunned’ MIT researchers find,” “It’s true: False news spreads faster and wider. And humans are to blame”) soon came under scrutiny:

In The Atlantic, Robinson Meyer elaborated:

Some political scientists also questioned the study’s definition of “news.” By turning to the fact-checking sites, the study blurs together a wide range of false information: outright lies, urban legends, hoaxes, spoofs, falsehoods, and “fake news.” It does not just look at fake news by itself — that is, articles or videos that look like news content, and which appear to have gone through a journalistic process, but which are actually made up.

Therefore, the study may undercount “non-contested news”: accurate news that is widely understood to be true. For many years, the most retweeted post in Twitter’s history celebrated Obama’s re-election as president. But as his victory was not a widely disputed fact, Snopes and other fact-checking sites never confirmed it.

The study also elides content and news. “All our audience research suggests a vast majority of users see news as clearly distinct from content more broadly,” [Rasmus Kleis] Nielsen, the Oxford professor, said in an email. “Saying that untrue content, including rumors, spread faster than true statements on Twitter is a bit different from saying false news and true news spread at different rates.”

But many researchers told me that simply understanding why false rumors travel so far, so fast, was as important as knowing that they do so in the first place.

And so, with these don’t-freak-out caveats:

Falsehoods “diffused significantly farther, faster, deeper, and more broadly than the truth in all categories of information,” the researchers write. “Whereas the truth rarely diffused to more than 1000 people, the top 1% of false-news cascades routinely diffused to between 1000 and 100,000 people,” and many more people retweeted fake than true news. Fake political news “traveled deeper and more broadly, reached more people, and was more viral than any other category of false information. False political news also diffused deeper more quickly and reached more than 20,000 people nearly three times faster than all other types of false news reached 10,000 people.”
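“Faster” here is a statement about diffusion dynamics — for instance, how long a cascade takes to accumulate a given number of unique users. A minimal sketch of that kind of measurement, with invented timestamps rather than the study’s data:

```python
def time_to_reach(events, n_users):
    """events: (timestamp_minutes, user_id) pairs for one cascade, in time order.
    Returns minutes from the origin tweet until n_users unique users have
    participated, or None if the cascade never gets that big."""
    seen = set()
    t0 = events[0][0]
    for t, user in events:
        seen.add(user)
        if len(seen) >= n_users:
            return t - t0
    return None

# Toy cascades: the "false" one accumulates unique users sooner.
false_cascade = [(0, "a"), (5, "b"), (9, "c"), (12, "d")]
true_cascade = [(0, "a"), (30, "b"), (55, "c"), (90, "e")]
print(time_to_reach(false_cascade, 4))  # 12 minutes
print(time_to_reach(true_cascade, 4))   # 90 minutes
```

Comparing these times across many true and false cascades, at several audience sizes, is what grounds claims like “nearly three times faster.”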

Why did the false rumor cascades spread faster? The team looked at a few hypotheses. Turns out A) it’s not bots and B) it’s not because the people who spread fake news have more followers. From a write-up of the paper by Science’s Katie Langin:

At first the researchers thought that bots might be responsible, so they used sophisticated bot-detection technology to remove social media shares generated by bots. But the results didn’t change: False news still spread at roughly the same rate and to the same number of people. By default, that meant that human beings were responsible for the virality of false news.

That got the scientists thinking about the people involved. It occurred to them that Twitter users who spread false news might have more followers. But that turned out to be a dead end: Those people had fewer followers, not more.

Finally the team decided to look more closely at the tweets themselves. As it turned out, tweets containing false information were more novel — they contained new information that a Twitter user hadn’t seen before — than those containing true information. And they elicited different emotional reactions, with people expressing greater surprise and disgust. That novelty and emotional charge seem to be what’s generating more retweets.

“It’s easier to be novel and surprising when you’re not bound by reality,” coauthor Roy told Scientific American’s Larry Greenemeier.
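Lexicon-based scoring is the standard way emotional reactions like these are measured at scale: count how many words in the replies to a tweet fall into each emotion category of a word-emotion lexicon. As a rough sketch of that family of technique (the tiny lexicon below is invented, far smaller than any real one):

```python
# Toy word-emotion lexicon, for illustration only.
LEXICON = {
    "surprise": {"shocking", "unbelievable", "wow", "stunned"},
    "disgust": {"gross", "disgusting", "vile", "awful"},
    "trust": {"confirmed", "reliable", "official", "verified"},
    "joy": {"great", "happy", "wonderful", "glad"},
}

def emotion_profile(replies):
    """Fraction of lexicon hits per emotion across all words in the replies."""
    counts = {emotion: 0 for emotion in LEXICON}
    total = 0
    for reply in replies:
        for word in reply.lower().split():
            for emotion, words in LEXICON.items():
                if word in words:
                    counts[emotion] += 1
                    total += 1
    return {e: c / total for e, c in counts.items()} if total else counts

replies_to_false = ["wow this is shocking", "unbelievable and disgusting"]
print(emotion_profile(replies_to_false))  # dominated by surprise and disgust
```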

The paper’s attracting tons of praise in my people-who-study-fake-news Twitter list, and coauthor Aral, who is the David Austin professor of management at MIT, wrote it up for this week’s New York Times Sunday Review section. “Though it was disheartening to learn that humans are more responsible for the spread of false stories than previously thought,” he writes, “this finding also implies that behavioral interventions may succeed in stemming the tide of falsity.”

And while you’re reading Science… The issue also includes “The science of fake news,” an overview by a veritable Who’s Who of academics who study this stuff (deep breath: David Lazer, Matthew Baum, Yochai Benkler, Adam Berinsky, Kelly Greenhill, Filippo Menczer, Miriam Metzger, Brendan Nyhan, Gordon Pennycook, David Rothschild, Michael Schudson, Steven Sloman, Cass Sunstein, Emily Thorson, Duncan Watts, Jonathan Zittrain). This article presents something of a summary of existing fake news research (with links to papers), lists the questions that remain, and suggests some possible interventions and solutions. One of their ideas (hint, hint):

We urge the platforms to collaborate with independent academics on evaluating the scope of the fake news issue and the design and effectiveness of interventions. There is little research focused on fake news and no comprehensive data-collection system to provide a dynamic understanding of how pervasive systems of fake news provision are evolving. It is impossible to recreate the Google of 2010. Google itself could not do so even if it had the underlying code, because the patterns emerge from a complex interaction among code, content, and users. However, it is possible to record what the Google of 2018 is doing. More generally, researchers need to conduct a rigorous, ongoing audit of how the major platforms filter information.

Along those lines, note that Twitter this week posted an opening for a new position, director of social science, whose duties will include “act as liaison between the broader research community and Twitter” and helping to “identify, support, and develop partnerships with external researchers.”

This isn’t enough academics, I want more. The Exploring Media Ecosystems conference was held at MIT this week, and you might want to take a scroll through the #emeMIT hashtag for highlights, including links to two lists of even more research into fake news.

Illustration from L.M. Glackens’ The Yellow Press (1910) via The Public Domain Review.

Laura Hazard Owen is the editor of Nieman Lab. You can reach her via email (laura_owen@harvard.edu) or Twitter DM (@laurahazardowen).