The growing stream of reporting on and data about fake news, misinformation, partisan content, and news literacy is hard to keep up with. This weekly roundup offers the highlights of what you might have missed.
Heads up, Wikipedia: YouTube’s coming. YouTube will add “information cues” — i.e., Wikipedia article links — to some of its videos, YouTube CEO Susan Wojcicki said at SXSW this week. Wired’s Louise Matsakis explains: “If you search and click on a conspiracy theory video about, say, chemtrails, YouTube will now link to a Wikipedia page that debunks the hoax alongside the video. A video calling into question whether humans have ever landed on the moon might be accompanied by the official Wikipedia page about the Apollo Moon landing in 1969. Wojcicki says the feature will only include conspiracy theories right now that have ‘significant debate’ on the platform.”
It turns out, though, that YouTube didn’t actually tell Wikipedia about its plans before announcing them…
The @Wikimedia Foundation statement about the recent @YouTube announcement pic.twitter.com/PFDDNtNNjn
— Wikimedia (@Wikimedia) March 14, 2018
While we are thrilled to see people recognize the value of @Wikipedia’s non-commercial, volunteer model, we know the community’s work is already monetized without the commensurate or in-kind support that is critical to our sustainability. https://t.co/d8TdTTPdgp
— Katherine Maher (@krmaher) March 14, 2018
Wojcicki, who recently suggested that Facebook stick to baby pictures despite her own platform being overrun with crisis actor harassment videos, seems to be punting here…ironically in much the same way FB did when it tried 3rd party flags on disputed & hoax content.
— Renee DiResta (@noUpside) March 14, 2018
Semi-related: What’s with tech companies acting like flat-earthers are semi-charming and possibly-correct-who-knows?
Wojcicki then engages on flat earth truthers.
Me paraphrasing what she said: Most people think the earth is round, some people think it's flat. That's the way it is. Why should we decide what people think?
— Ryan Mac (@RMac18) March 13, 2018
how did platforms decide that their go-to example of the non-judge-ability of "truth" and "fact" should be FLAT EARTHERS

@laurahazardowen and I were wondering
— Shan Wang ☃ (@shansquared) March 15, 2018
Thread:
doesn't speak enormously well of the advertising-based business model of the internet that one of its premiere information-services corporations (market cap $760b) relies on a donation-funded, volunteer-edited nonprofit to provide accurate information https://t.co/qtUq1j9rKZ
— Max Read (@max_read) March 13, 2018
Meanwhile, Zeynep Tufekci wrote for The New York Times this past week on YouTube as a radicalizing agent. Drawing on recent investigations by The Wall Street Journal and others, she writes, “YouTube’s tendency toward the incendiary seems evident…”
In effect, YouTube has created a restaurant that serves us increasingly sugary, fatty foods, loading up our plates as soon as we are finished with the last meal. Over time, our tastes adjust, and we seek even more sugary, fatty foods, which the restaurant dutifully provides. When confronted about this by the health department and concerned citizens, the restaurant managers reply that they are merely serving us what we want.
This situation is especially dangerous given how many people — especially young people — turn to YouTube for information. Google’s cheap and sturdy Chromebook laptops, which now make up more than 50 percent of the pre-college laptop education market in the United States, typically come loaded with ready access to YouTube.
The European Commission released its report on combating disinformation. Earlier this year, the EC assembled a group of experts to come up with recommendations on fighting disinformation and fake news in the EU. The group’s report was published this week. The recommendations fall into the usual categories (enhanced transparency, media literacy, more research, etc.). The report’s authors list their six key points here. A few other bits I thought were interesting:
— Don’t over-regulate. “The [High Level Expert Group] believes the best responses are likely to be those driven by multi-stakeholder collaborations, minimize legal regulatory interventions, and avoid the politically dictated privatization of the policing and censorship of what is and is not acceptable forms of expression.” Rasmus Kleis Nielsen, director of research at the Reuters Institute for the Study of Journalism at Oxford and a member of the expert group that produced this report, has more on this here.

— Platforms should give up some data, enough to allow “independent inquiries, audits and research into activities reliant on proprietary media and data infrastructures with a view to ensuring transparency and authenticity of information.”
— Fact-checking groups should find ways to work together across Europe. “As fact-checking activities in the EU are still relatively fragmented, more work can and should be done by fact-checkers, verification organizations, and professional newsrooms in a collaborative manner within EU Member States and across the EU to exploit the untapped potential of cross-border and cross-sector cooperation and improve their working methods through the adoption of state-of-the-art technologies. Existing partnerships with platforms should be expanded across Europe with a clear roadmap for data sharing with academics that will allow for better understanding of disinformation strategies and their dynamics.” The report suggests creating “European Centres for interdisciplinary and independent evidence-based research on problems of disinformation.”
— There’s a need to “think more strategically about how media literacy is implemented across Europe…with clear methods of evaluation and cross-country comparison.” The authors even suggest including information and media literacy in the OECD’s Program for International Student Assessment, which every three years tests 15-year-olds on science, mathematics, reading, collaborative problem solving, and financial literacy.
Meanwhile, what’s already happening around the world? Poynter’s Daniel Funke has a guide to how different countries are looking to stem the flow of online misinformation. So far it’s mostly proposed laws and draft bills. Poynter will be updating the list on an ongoing basis.
Pinterest: Not exempt! “If you only use Pinterest for finding recipes and interior design ideas, mis/disinformation may never cross your home feed,” writes Amy Collier, associate provost for digital learning at Middlebury College and head of Middlebury’s Office of Digital Learning and Inquiry. “But if you’ve searched for any information on contested topics like vaccinations, gun control, or climate change, you probably have seen mis/disinformation in action.” She looks at fake/spam Pinterest accounts that use polarizing political pins, and pins spreading misinformation, to draw attention to their affiliate-link posts.
More clarification on that Science paper. The “fake news spreads faster than real news” paper that I wrote about last week has been the subject of continued Twitter debate. Coauthor Deb Roy offered a diagram of the paper’s actual scope, compared to the much broader scope attributed to it in much of the coverage. That scope probably wasn’t made as clear as it should have been, not just in the coverage of the paper but in the paper itself.
Thanks to @jjaron for some clarifying comments to my first diagram. Here's a second one that fills in some more detail. The outer circles point to areas where we need further research! pic.twitter.com/KhJth1zFnp
— Deb Roy (@dkroy) March 15, 2018