Audience & Social
LINK: gizmodo.com   |   Posted by: Ricardo Bilton   |   May 9, 2016, 1:17 p.m.

Despite its claims, Facebook has never been a completely neutral platform. Even the supposedly objective News Feed algorithm is a product of the human biases that helped create it.

The same, it seems, goes for Facebook’s trending topics, which are far from the objective reflection of popularity that Facebook claims they are, according to a new report from Gizmodo. Not only did trending topics writers inject non-trending stories into the widget, but they regularly suppressed trending stories from conservative news sites in favor of “more neutral” stories from mainstream sites, unidentified sources told Gizmodo, which wrote about Facebook’s relationship with its news curators last week.

“It was absolutely bias. We were doing it subjectively,” a former curator said. (Gizmodo’s Michael Nunez interviewed several former Facebook news curators who worked for the site between 2014 and 2015. All of them chose to remain anonymous.) Still, the Gizmodo article cautions:

Stories covered by conservative outlets (like Breitbart, Washington Examiner, and Newsmax) that were trending enough to be picked up by Facebook’s algorithm were excluded unless mainstream sites like the New York Times, the BBC, and CNN covered the same stories.

Other former curators interviewed by Gizmodo denied consciously suppressing conservative news, and we were unable to determine if left-wing news topics or sources were similarly suppressed. The conservative curator described the omissions as a function of his colleagues’ judgments; there is no evidence that Facebook management mandated or was even aware of any political bias at work.

Facebook curators also said they were discouraged from posting trending stories about Facebook itself, often having to go through several layers of management to get posts approved. Facebook, in that respect, isn’t too different from some major news organizations. Bloomberg News, for example, has a policy against covering parent company Bloomberg LP and its founder.

Reactions to the Gizmodo story have been mixed. While some say it just provides more evidence of the dangers of a single massive platform having outsize control over access to information, others don’t see why it’s shocking that Facebook isn’t entirely neutral.

There’s also the question of how much responsibility Facebook has to both promote stories it thinks people should read and suppress stories that spread misinformation, as it’s already done with hoaxes that go viral. In that respect, the biases of human editors might be a feature, not a bug. News organizations regularly make similar editorial judgments. As Facebook becomes an increasingly central news source for its users, there’s a lot more pressure for it to act like a legitimate one.

The problem: while all publishers constantly exercise that kind of editorial judgment, there’s considerably less comfort with Facebook doing the same.
