Nieman Foundation at Harvard
March 23, 2018, 9:43 a.m.
Audience & Social

Okay, if Facebook and Google aren’t publishers: How about editors?

Plus: Facebook found (and shut down) a Macedonian disinformation effort in the Alabama special election, and Facebook groups could get garbage-y fast.

The growing stream of reporting on and data about fake news, misinformation, partisan content, and news literacy is hard to keep up with. This weekly roundup offers the highlights of what you might have missed.

Oh, Facebook. The big story right now is obviously Cambridge Analytica (we have roundups of the news here and here), but in an interview all about that with The New York Times this week, Facebook CEO Mark Zuckerberg also said a few things about fake news (or false news, as Facebook prefers to call it). First:

Take things like false news. You know, a lot of it is really spam, if you think about it. It’s the same people who might have been sending you Viagra emails in the ’90s, now they’re trying to come up with sensational content and push it into Facebook and other apps in order to get you to click on it and see ads. There are some pretty basic policy decisions we’ve made, like O.K., if you’re anywhere close to being a fake news site, you can’t put Facebook ads on your site, right? So then suddenly, it becomes harder for them to make money. If you make it hard enough for them to make money, they just kind of go and do something else.
One of the things that gives me confidence is that we’ve seen a number of elections at this point where this has gone a lot better. In the months after the 2016 election, there was the French election. The new A.I. tools we built after the 2016 elections found, I think, more than 30,000 fake accounts that we believe were linked to Russian sources who were trying to do the same kind of tactics they did in the U.S. in the 2016 election. We were able to disable them and prevent that from happening on a large scale in France.

Last year, in 2017, with the special election in Alabama, we deployed some new A.I. tools to identify fake accounts and false news, and we found a significant number of Macedonian accounts that were trying to spread false news, and were able to eliminate those. And that, actually, is something I haven’t talked about publicly before, so you’re the first people I’m telling about that.

Craig Silverman, Jane Lytvynenko, and Lam Thuy Vo wrote this week for BuzzFeed about sketchiness in Facebook Groups. Facebook is pushing Groups hard as a way of promoting “meaningful interactions,” and while it seems as if the Groups experience for most users is positive (I LOVE my local Facebook moms’ group; it’s the main reason I’m not quitting Facebook), Groups seem to be as subject to spammers, hackers, and trolls as everything else on Facebook is — and will likely get worse as Facebook makes them a bigger part of its strategy.

“A binary signal of whether a source is trusted or not is absurd.” Campbell Brown, Facebook’s head of news partnerships, debated Richard Gingras, Google’s senior director of news and social products, at the Financial Times’ Future of News event Thursday in New York. I now have the actual transcript of their conversation (note: the transcript was provided to me by Facebook), so I’m just sticking it here:

Campbell Brown: I don’t think we can take such a hands-off approach. I do think this is a fantastic place where the platforms can collaborate with publishers, with people like Emily [Bell], to try to identify signals that we can all have consensus on. There are some obvious ones — does a news organization have a corrections policy when they make a mistake? Who is the news organization, who works there, what are their bios, how long have they been around?

Matthew Garahan (FT): Some sort of accreditation system?

Brown: I think we are moving in that direction. I don’t know where we’ll land [Richard talking over].

Richard Gingras: From a First Amendment perspective we don’t want anyone accrediting who a journalist is, but to that point, for instance, one thing that we did is work with the industry on the Trust Project, which is how do organizations themselves do a better job of presenting who they are, what they’re about, their expertise, do they have those correction policies… [continues]

Brown: Having been a journalist for more than 20 years, I don’t think just anyone can call themselves a journalist. I don’t think we should say, anyone who says I’m a journalist …

Gingras: I would be very reluctant to live in a society where, like, who decides who a journalist is. But my point is, I want to raise a good example here. Trinity Mirror in the UK went and implemented all of the recommendations of the Trust Project…
Brown: That’s exactly what I’m talking about: there are signals, and the Trust Project is working on them. I do think this is an area where we can collaborate and we should collaborate, as opposed to everybody working in their own little bubble to try to figure this out. This is a problem that’s going to impact us all, as I said before. Given the direction fake news is going, it’s going to impact publishers, it’s going to impact newsrooms, it’s going to affect platforms, and if we don’t work together, we’re going to have a lot less success in addressing it.

Illustration from L.M. Glackens’ The Yellow Press (1910) via The Public Domain Review.

PART OF A SERIES     Real News About Fake News