The growing stream of reporting on and data about fake news, misinformation, partisan content, and news literacy is hard to keep up with. This weekly roundup offers the highlights of what you might have missed.
“Just for being false, that doesn’t violate the community standards.” CNN’s Oliver Darcy asked John Hegeman, the head of Facebook News Feed, why InfoWars is allowed to maintain a page (with over 900,000 followers) when it is notorious for spreading fake news and conspiracy theories, such as the claims that 9/11 was an inside job and that the Sandy Hook shooting never happened.
Hegeman said that the company does not “take down false news.”
“I guess just for being false that doesn’t violate the community standards,” Hegeman said, explaining that InfoWars has “not violated something that would result in them being taken down.”
Hegeman added, “I think part of the fundamental thing here is that we created Facebook to be a place where different people can have a voice. And different publishers have very different points of view.”
This thread, by BuzzFeed’s Davey Alba, has more from the same event. Facebook seems okay with downranking these sources in its algorithm. But it doesn’t seem to want an actual human to make the decision to ban them altogether.
Next Q: FB said it would “Reduce” & not “remove” false news—if something has been debunked, why not remove? SU: I think the best approach is to use ranking and try to encourage better behavior. To use “reduce” and then “inform.”
— Davey Alba (@daveyalba) July 12, 2018
.@oliverdarcy: I cant understand how FB can say we’re committed to fighting fake news but that @RealAlexJones @infowars can profit off your platform.
SU: …It bugs me too. We need to define that in a clear, fair way & figure out how our policies apply. We have a long way to go.
— Davey Alba (@daveyalba) July 12, 2018
HEGEMAN 2/2: …I understand you probably disagree with aspects of what we're doing here. But I think that's probably the most important thing to focus on (in terms of impact in the world); how many people are seeing things & how much distribution it's getting.
— Davey Alba (@daveyalba) July 12, 2018
Facebook followed up on Twitter:
Instead, we demote individual posts etc. that are reported by FB users and rated as false by fact checkers. This means they lose around 80% of any future views. We also demote Pages and domains that repeatedly share false news.
— Facebook (@facebook) July 12, 2018
"if recategorizing it as fake caused it to lose 99% of its views, would that also be fine because not a ban?" then repeat question, adding 9s
— Liam Liwanag Burke (@liamlburke) July 12, 2018
Through Social Science One, researchers can get access to Facebook data. Social Science One, launched Wednesday, is an independent research commission that will give social scientists access to previously private Facebook data. The initiative, announced back in April, is funded by outside organizations, and the research won’t be subject to Facebook’s approval. Robbie Gonzalez reported in Wired:
Starting today, researchers from around the world can apply for funding and data access that Social Science One will approve — not Facebook. If researchers want to search for something in the platform’s data that could make it look bad — or if they actually find something — Facebook won’t be able to pump the brakes.
Academics can now submit proposals to study information/misinformation, based on a dataset of public URLs Facebook users globally have clicked on, when, and by what types of people, including many links judged to be intentionally false news stories by third party fact checkers.
— Social Science One (@SocSciOne) July 11, 2018
To track opportunities and find out more:
All information will be posted at SocialScience.One, with notifications on Twitter and Facebook. Researchers can also sign up for our mailing list. Social Science One will release regular RFIs (requests for information) and RFPs (requests for proposals). Formal proposals will all be submitted through the Social Science Research Council (SSRC). Proposals will be accepted on a rolling basis, with reviews scheduled periodically. Detailed codebooks for the available datasets will be available at SocialScience.One. Over time, we will add new types of datasets, and most existing datasets will grow as more data come in.
WhatsApp wants to enable more research about misinformation, too (but won’t give data). The Facebook-owned WhatsApp will fund research into the spread of misinformation on the platform. “The program will make unrestricted awards of up to $50,000 per research proposal,” and grantees will be invited (travel and lodging paid) to two workshops. Applications close August 12, 2018, at 11:59 pm PT. Unlike with Social Science One, however, “no WhatsApp data will be provided to award recipients.”
FB providing awards for research related to WhatsApp. No data provided though. Focus is on topics that intersect with the platform, including disinfo and elections (using data from eg interviews, surveys, participant observation, etc). https://t.co/jTnWk2Lu3m
— Kelly Born (@KellyKborn) July 7, 2018
Speaking of WhatsApp, this week it launched a feature that indicates when a message has been forwarded; the changes, the Financial Times reports, come after “a spate of lynchings in India that were alleged to have been sparked by false WhatsApp rumors.” WhatsApp is also running fake news warnings in Indian newspapers, including information about the new forwarding feature. India is WhatsApp’s largest market, with 200 million users.
“A bit more effort might go a long way.” New research from Gordon Pennycook and David Rand suggests that susceptibility to fake news is driven less by strong partisanship than by lazy, non-analytical thinking. This is maybe a good thing, if laziness is an easier problem to target (which, is it?). Anyway, the research included 3,446 participants on MTurk. Pennycook and Rand write:
Individuals who are more willing to think analytically when given a set of reasoning problems (i.e., two versions of the Cognitive Reflection Test) are less likely to erroneously think that fake news is accurate. Crucially, this was not driven by a general skepticism toward news media: More analytic individuals were, if anything, more likely to think that legitimate (“real”) news was accurate. All of the real news stories that we used — unlike the fake ones — were factually accurate and came from mainstream sources. Thus, our evidence indicates that analytic thinking helps to accurately discern the truth in the context of news headlines. More analytic individuals were also better able to discern real from fake news regardless of their political ideology, and of whether the headline was Pro-Democrat, Pro-Republican, or politically neutral; and this relationship was robust to controlling for age, gender, and education.
They found some differences based on political ideology:
The overall capacity to discern real from fake news was lower among those who preferred Donald Trump over Hillary Clinton, relative to those who preferred Hillary Clinton over Donald Trump (the one exception being that in Study 2, those who preferred Trump were better at discerning Republican-consistent items)…
The present results indicate that there is, in fact, a political asymmetry when it comes to the capacity to discern the truth in news media. Moreover, the association between conservatism and media truth discernment held independently of CRT performance. This may help explain why Republican-consistent fake news was apparently more common than Democrat-consistent fake news leading up to the 2016 Presidential election (Allcott & Gentzkow, 2017; Guess, Nyhan, & Reifler, 2018) and why the media ecosystem (including open web links, and both Twitter and Facebook sharing) is more polarized on the political right than on the left in the U.S. (Faris et al., 2017). Nonetheless, it remains unclear precisely why Republicans (at least in Mechanical Turk samples) are apparently worse at discerning between fake and real news.
Planning your 2018 travel? Here’s a calendar of digital disinformation–related events!