What does Facebook’s new tool for fighting fake news mean for real publishers?

Tweaks to the News Feed algorithm that push back against satire and hoaxes suggest that Facebook wants better content — but is pushing the responsibility onto its users.

Yesterday, Facebook announced yet another tweak to the algorithm that governs its users’ News Feeds. The social network has introduced a new tool that allows users to flag a post as “a false news story.” The move follows a few other attempts by the platform to better delineate different types of content: in August, for example, it was reported that the company was experimenting with satire tags meant to help users differentiate between parody and news. They’ve also taken steps to push back against clickbait.

Importantly, Facebook doesn’t do any of this tagging itself. Instead, it relies on its more than one billion users to recognize and label links, videos, and photos that they perceive to be hoaxes. In an email, a Facebook spokesperson emphasized that the update is merely an additional signal helping to guide the News Feed ranking algorithm. (“This is an update to the News Feed ranking algorithm. There are no human reviewers or editors involved. We are not reviewing content and making a determination on its accuracy, and we are not taking down content reported as false.”)
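To make the mechanics concrete, here is a minimal sketch in Python of how a “reported false” flag could work as just another negative input to a feed-ranking score. Facebook has not published its model, so every signal name and weight below is an illustrative assumption, not the company’s actual formula.

```python
# Hypothetical sketch: a "reported false" flag as one negative signal among
# many in a feed-ranking score. All names and weights here are assumptions
# for illustration; Facebook's real model is not public.

from dataclasses import dataclass

@dataclass
class PostSignals:
    affinity: float      # closeness between viewer and poster, 0-1
    engagement: float    # likes, comments, shares, normalized to 0-1
    recency: float       # decays with the age of the post, 0-1
    spam_reports: int    # the preexisting negative signal
    false_reports: int   # the new "false news story" reports

def rank_score(s: PostSignals) -> float:
    """Fold positive and negative signals into one score.

    Reports only demote the post in ranking; nothing is reviewed or
    removed, matching Facebook's statement that no editors are involved.
    """
    positive = 0.5 * s.affinity + 0.3 * s.engagement + 0.2 * s.recency
    penalty = 0.05 * s.spam_reports + 0.08 * s.false_reports
    return max(0.0, positive - penalty)

# A widely shared hoax with many reports sinks below an ordinary post:
hoax = PostSignals(affinity=0.9, engagement=0.8, recency=0.9,
                   spam_reports=2, false_reports=6)
normal = PostSignals(affinity=0.5, engagement=0.4, recency=0.6,
                     spam_reports=0, false_reports=0)
print(rank_score(hoax), rank_score(normal))  # ~0.29 vs. ~0.49
```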

Of course, there are humans involved in reviewing fake news content — just not ones who work for Facebook. But as Dartmouth assistant professor of government Brendan Nyhan suggests, at this point Facebook simply delivers too much content for its own human moderation to be feasible. “I think if they tried to put a human in the loop of the content moving through their platform, they would have to have an army,” he says. “Human moderation doesn’t scale well. Would you prefer a human doing this? I’m not sure I would. It requires a lot of background knowledge to determine what’s true and what’s false.”

It would be an exaggeration to say that fake news sites have plagued Facebook, but links to stories built around false information meant to drive traffic do exist, and they can mislead readers across the Internet. Adrienne LaFrance, a former Nieman Lab staffer who is now a senior associate editor at The Atlantic, a year ago started Antiviral, a Gawker column aimed at debunking viral hoaxes. She says users might not find it as easy as Facebook expects to tell truth from fiction.

“Facebook is adding a layer of what looks like editorial accountability without actually taking on the responsibility of figuring out what’s true,” she wrote in an email. “So Facebook gives the impression that it is an editorial gatekeeper, but there’s still this buffer that protects Facebook from having to actually explain its thinking the way a newsroom would have to.”

Of course, with this measure Facebook isn’t taking aim at mainstream news outlets that get duped by hoaxers; its target is much narrower. From the press release:

The vast majority of publishers on Facebook will not be impacted by this update. A small set of publishers who are frequently posting hoaxes and scams will see their distribution decrease.

Craig Silverman, a fellow at Columbia’s Tow Center, recently founded Emergent.info, a “real-time rumor tracker” that “aims to develop best practices for debunking misinformation.” He’d reached out to Facebook before yesterday’s announcement in the hopes that the company would take some kind of action against sites that deliberately circulate false information.

“What they really try to do is jump on things that are already in the news, or celebrities — stuff that has some level of consciousness in the public,” Silverman says of these sites. “They say, based on the story that’s already out there, what can we do that gets a reaction out of people?” Silverman keeps a list of around 16 repeat offenders — including The Daily Currant, National Report, World News Daily Report, Empire News, ScrapeTV, and more — which he sent to Facebook, knowing they wouldn’t blacklist the sites, but hoping they would take some sort of action.

Facebook has shown interest in debunking rumors and hoaxes before. In the past year, the company has published two papers that track how rumors spread. In one study, researchers looked at how users reacted to having their mistaken judgment pointed out by friends, typically via a copy-pasted link to Snopes.com, the rumor-fighting website. They found that “people are two times more likely to delete hoaxes after receiving a comment from a friend about it being a hoax.”

But users are also made uncomfortable by having attention drawn to their mistakes, which can decrease interaction and engagement on the site. “By debunking this stuff, you look like a killjoy. You look like a know-it-all,” says Silverman. That finding has, naturally, influenced the way Facebook built its own anti-hoax tool. “They don’t want to put up barriers to sharing, or create negative experiences for people who have done the sharing,” Silverman adds. By introducing a crowd-based tagging system that de-ranks hoax posts, rather than taking a more direct or aggressive approach, Facebook is attempting to maintain a sense of neutrality in the News Feed.

Facebook says the false news tag is just one in a suite of signals it uses to guide its algorithm. But as long as the company relies on automation, it’s conceivable that users could band together to abuse the tool.

Twitter has already encountered a version of this problem. In November, a New York Times story about Florida State University football players who received preferential treatment from the police was flagged as spam, which caused the URL to redirect readers to a warning page. Though Twitter hasn’t made clear exactly what happened, it’s evident that user spam flags can cause errors that directly affect publishers. (Twitter hadn’t gotten back to me before publication time.)

“In my research, I’ve found people can be very resistant to unwelcome information,” says Nyhan. “I wonder if people would report things as hoaxes that they don’t like. Imagine you see a story about climate change, and you don’t believe in climate change. If enough people do that, does it start monkeying with the algorithm in problematic ways?”

In response to questions about how they would deal with such an attack, a Facebook spokesperson would only say: “Reporting a story as false is another negative signal, similar to reporting a post as spam. Using a range of signals in ranking helps guard against abuse.”
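The spokesperson’s point, that ranking on a range of signals limits what any single abused signal can do, can be sketched in a few lines. This is a hedged illustration under assumed per-reporter trust weights, not Facebook’s actual safeguard: giving report counts diminishing returns and a hard cap means a coordinated brigade can only demote a story so far.

```python
# Illustrative sketch (not Facebook's real defense) of why combining many
# signals blunts coordinated abuse: any one signal's contribution is bounded,
# so mass-reporting a story only moves its rank so far.

import math

def report_penalty(report_weights, cap=1.0):
    """Sum hypothetical per-reporter trust weights with diminishing returns.

    report_weights: assumed weights per user (e.g., lower for accounts whose
    past reports were rarely corroborated). The log plus the cap keep 1,000
    coordinated reports from having 1,000x the effect of one.
    """
    raw = sum(report_weights)
    return min(cap, math.log1p(raw))

brigade = [0.1] * 500           # 500 coordinated low-trust reporters
organic = [0.3, 0.4]            # a couple of ordinary reports
print(report_penalty(brigade))  # capped at 1.0
print(report_penalty(organic))  # ~0.53, proportionate to the signal
```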

If the tweak works, that will be good news for publishers, who won’t have to compete as directly with fake viral stories. Facebook, long a big driver of news traffic, grew even bigger in 2014, with many Facebook users getting little news from other sources.

Silverman said it’s important to remember Facebook’s moves are based in self-interest: a better user experience means more engaged users, which means more profit. “They want news producers and content producers to put content on Facebook and do revenue shares. They want that environment to be good for monetization,” he says. “They want people to have a good experience and not say, ‘Everything I saw on my News Feed is garbage.'”

Photo by Franco Bouly used under a Creative Commons license.

POSTED     Jan. 21, 2015, 3:27 p.m.