Nieman Foundation at Harvard
May 9, 2016, 1:17 p.m.
Audience & Social
LINK: gizmodo.com  ➚   |   Posted by: Ricardo Bilton   |   May 9, 2016

Despite its claims, Facebook has never been a completely neutral platform. Even the supposedly objective News Feed algorithm is a product of the human biases that helped create it.

The same, it seems, goes for Facebook’s trending topics, which are far from the objective reflection of popularity that Facebook claims they are, according to a new report from Gizmodo. Not only did trending topics writers inject non-trending stories into the widget, but they regularly suppressed trending stories from conservative news sites in favor of “more neutral” stories from mainstream sites, unidentified sources told Gizmodo, which wrote about Facebook’s relationship with its news curators last week.

“It was absolutely bias. We were doing it subjectively,” a former curator said. (Gizmodo’s Michael Nunez interviewed several former Facebook news curators who worked for the site between 2014 and 2015. All of them chose to remain anonymous.) Still, the Gizmodo article cautions:

Stories covered by conservative outlets (like Breitbart, Washington Examiner, and Newsmax) that were trending enough to be picked up by Facebook’s algorithm were excluded unless mainstream sites like the New York Times, the BBC, and CNN covered the same stories.

Other former curators interviewed by Gizmodo denied consciously suppressing conservative news, and Gizmodo was unable to determine whether left-leaning news topics or sources were similarly suppressed. The curator quoted above, who is politically conservative, described the omissions as a function of his colleagues' judgments; there is no evidence that Facebook management mandated, or was even aware of, any political bias at work.

Facebook curators also said they were discouraged from posting trending stories about Facebook itself, often having to go through several layers of management to get posts approved. Facebook, in that respect, isn’t too different from some major news organizations. Bloomberg News, for example, has a policy against covering parent company Bloomberg LP and its founder.

Reactions to the Gizmodo story have been mixed. While some say it just provides more evidence of the dangers of a single massive platform having outsize control over access to information, others don't see why it's shocking that Facebook isn't entirely neutral.

There's also the question of how much responsibility Facebook has both to promote stories it thinks people should read and to suppress stories that spread misinformation, as it has already done with hoaxes that go viral. In that respect, the biases of human editors might be a feature, not a bug. News organizations regularly make similar editorial judgments. As Facebook becomes an increasingly central news source for its users, there's more pressure for it to act like a legitimate one.

The problem: while all publishers constantly exercise that kind of editorial judgment, there's considerably less comfort with Facebook doing the same.
