March 2, 2018, 6:44 a.m.

Disinformation spread online is so disorienting that it’s messing with the researchers who study it

Plus: Why outrage-tweeting is a dangerous thing, and why we have to teach students not to be “trust misers.”

The growing stream of reporting on and data about fake news, misinformation, partisan content, and news literacy is hard to keep up with. This weekly roundup offers the highlights of what you might have missed.

“Very effective and extremely disorienting.” This week I got to hear Kate Starbird, assistant professor at the University of Washington and director of its Emerging Capacities of Mass Participation (emCOMP) Laboratory, speak about her research into how online disinformation spreads during crisis events like shootings and terrorist attacks, and about what she’s learned about the networks behind that disinformation and the tactics they use.

A few of the intriguing bits from Starbird’s talk:

— She and her team have looked a lot at the language that conspiracy theorists use both in tweets and on sites like 21stCenturyWire.com. This is “question-mark language,” Starbird said. “‘I’m not gonna tell you what to think, I’m just gonna put the evidence out there and you can make up your mind yourself’ — this way of talking persists across different events” from Sandy Hook to the Boston Marathon bombing to the Orlando shooting.

— Starbird spent a lot of time reading the sites that were spreading these conspiracy theory posts — the sites behind the links being tweeted out. (“I do not recommend this.”) Stuff she looked at: homepages, about pages, ownership, authors, and common themes and stories. She developed coding schemes for theme, political view, and so on. Common themes: “aliens, anti-big pharma, chemtrails, anti-corporate media, geo-engineering, George Soros, anti-globalist, anti-GMO, Flat Earth, Illuminati, Koch Brothers, anti-media, 9-11 truth, New World Order Cabal, nutritional supplements, pedophile rings, Rothschilds, anti-vaccine, anti-Zionist.” (On the subject of GMOs, by the way, please read this tweet thread, which is not about conspiracy theories but is really interesting to keep in mind as you read about Starbird’s work.)

“When you do this kind of thing, you should keep a rope around your ankle and have someone to pull you back up,” Starbird said. “This is a really disorienting part of the Internet.”

— Starbird attempted to categorize the political orientation of the sites. She’d imagined they would break down along conservative vs. liberal lines; “that is not what I found at all.” Instead, she found concepts used flexibly to connect people on both the left and the right. As she wrote in a 2017 Medium post describing this research:

It quickly became clear that the U.S. left (liberal) vs. right (conservative) political spectrum was not appropriate for much of this content. Instead, the major political orientation was towards anti-globalism. Almost always, this orientation was made explicit in the content.

The meaning of globalism varied across the sites. For some websites focused on a U.S. audience, globalism implied a pro-immigrant stance. For more internationally-focused sites, globalism was used to characterize (and criticize) the influence of the U.S. government in other parts of the world. In some of the more conspiracy-focused sites, the term was used to suggest connections to a global conspiracy by rich, powerful people who manipulated the world for their benefit. Globalism was also tied to corporatism — in other words, the ways in which large, multi-national companies exert power over the world. And the term was also connected, implicitly and explicitly, to mainstream media.

— Starbird spoke about the effects — on her, her research partners, and students — of reading so much of this kind of content. It’s “very effective and extremely disorienting,” she said. “We’ve had trouble even reporting on this data because we’re so confused about what’s going on and it’s so hard to make sense of things.” It’s scary stuff. (Speaking of scary stuff, read my interview with Jonathan Albright from this week if you haven’t; he mentioned Starbird as one of his favorite researchers, by the way.) “This way of thinking, once you get into it, is very sticky. Once you’re down there, it’s so hard to get out, and it’s hard to think of solutions.”

“Outrage-spreading.” Molly McKew writes in Wired about how liberals contributed to the spread of a Parkland shooting conspiracy theory, “in some cases far more than the supporters of the story…algorithms — apparently absent the necessary ‘sentiment sensitivity’ that is needed to tell the context of a piece of content and assess whether it is being shared positively or negatively — see all that noise the same.”
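
McKew’s point about missing “sentiment sensitivity” is worth making concrete. Here is a minimal, hypothetical sketch in Python (the names and logic are mine for illustration, not any platform’s actual ranking code) of an engagement signal that cannot tell endorsement from outrage: a share that dunks on a story boosts it exactly as much as a share that approves of it.

    from dataclasses import dataclass

    @dataclass
    class Post:
        url: str
        shares: int = 0  # raw engagement: the only signal this ranker sees

    def record_share(post: Post, stance: str) -> None:
        # stance is "endorse" or "debunk", but the counter ignores it;
        # outrage and approval are indistinguishable to the algorithm
        post.shares += 1

    def ranking_score(post: Post) -> float:
        return float(post.shares)  # more shares means wider distribution

    conspiracy = Post("hoax.example/parkland-conspiracy")
    record_share(conspiracy, stance="endorse")
    record_share(conspiracy, stance="debunk")  # an outrage-tweet
    record_share(conspiracy, stance="debunk")  # a quote-tweet dunking on it
    print(ranking_score(conspiracy))  # 3.0; the debunks boosted it too

Under a signal like this, the outrage-tweets McKew describes are indistinguishable from genuine amplification.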

“There’s not an option not to trust anyone.” Mike Caulfield of the Digital Polarization Initiative (he was also featured in last week’s column!) has a great post arguing that part of any good media literacy program is teaching students that they do need to trust something. Quoting liberally:

There’s not an option to not trust anyone, at least not an option that is socially viable. And societies without trust come to bad ends. Students are various, of course, but what I find with many students is they are trust misers — they don’t want to spend their trust anywhere, and they think many things are equally untrustworthy. And somehow they have been trained to think this makes them smarter than the average bear.

A couple stories will illustrate the problem. I was once working with a bunch of students and comparing Natural News (a health supplements site which specializes in junk science claims) and the Mayo Clinic, one of the most respected outfits out there. OK, I say, so what’s the problem with taking advice from Natural News?

Well, says a student, they make their money selling supplements, and so they have an incentive to talk down traditional medicine.

I beam like a proud papa. Good analysis!

“And,” the student continues, “the Mayo Clinic is the same way. They make money off of patients so they want to portray regular hospitals as working.”

Houston, we have a problem. […]

I’ve referred to this before as trust compression, the tendency for students to view vastly different levels of credibility of sources all as moderately or severely compromised. Breitbart is funded by the Mercers, who are using it directly to influence political debate, but the Washington Post is also owned by Jeff Bezos who donated to Democrats. So it’s a wash. And yes, we have the word of an expert in a subject where she has multiple cites against the word of a lobbying group but neither one is perfect really. Everyone’s got an agenda, nobody knows everything, and there’s not 100% agreement on anything anyway.

In a follow-up thread, Caulfield talked about techniques for teaching “trust decompression.”

Illustration from L.M. Glackens’ The Yellow Press (1910) via The Public Domain Review.

Laura Hazard Owen is the editor of Nieman Lab. You can reach her via email (laura_owen@harvard.edu) or Twitter DM (@laurahazardowen).