Jan. 21, 2015, 3:27 p.m.

What does Facebook’s new tool for fighting fake news mean for real publishers?

Tweaks to the News Feed algorithm that push back against satire and hoaxes suggest that Facebook wants better content — but is pushing responsibility on its users.

Facebook yesterday announced yet another tweak to the algorithm that governs its users’ News Feeds. The social network has introduced a new tool that allows users to flag a post as “a false news story.” The move follows a few other attempts by the platform to better delineate different types of content: in August, for example, it was reported that the company was experimenting with satire tags meant to help users differentiate between parody and news. Facebook has also taken steps to push back against clickbait.

Importantly, Facebook doesn’t do any of this tagging itself. Instead, it relies on its more than one billion users to recognize and label links, videos, and photos that they perceive to be hoaxes. In an email, a Facebook spokesperson emphasized that the update is merely an additional signal helping to guide the News Feed ranking algorithm. (“This is an update to the News Feed ranking algorithm. There are no human reviewers or editors involved. We are not reviewing content and making a determination on its accuracy, and we are not taking down content reported as false.”)

Of course, there are humans involved in reviewing fake news content — just not ones who work for Facebook. But as Dartmouth assistant professor of government Brendan Nyhan suggests, at this point Facebook simply delivers too much content for its own human moderation to be feasible. “I think if they tried to put a human in the loop of the content moving through their platform, they would have to have an army,” he says. “Human moderation doesn’t scale well. Would you prefer a human doing this? I’m not sure I would. It requires a lot of background knowledge to determine what’s true and what’s false.”
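
To make concrete what a purely automated, signal-based approach looks like, here is a minimal sketch in Python. The field names and weights are invented for illustration; Facebook has not published its actual News Feed formula. The key property matches the spokesperson’s description: flags lower a post’s distribution but never remove it.

```python
# Hypothetical sketch of crowdsourced flags as ranking signals.
# All field names and weights are invented; Facebook has not
# published its actual News Feed formula.

def rank_score(post):
    """Combine positive engagement signals with negative report signals."""
    score = (
        1.0 * post["likes"]
        + 2.0 * post["comments"]
        + 3.0 * post["shares"]
    )
    # Reports reduce distribution; nothing here deletes the post,
    # mirroring Facebook's "we are not taking down content" stance.
    score -= 8.0 * post["spam_reports"]
    score -= 8.0 * post["false_news_reports"]
    return score

posts = [
    {"id": "news", "likes": 120, "comments": 40, "shares": 25,
     "spam_reports": 0, "false_news_reports": 0},
    {"id": "hoax", "likes": 300, "comments": 90, "shares": 80,
     "spam_reports": 5, "false_news_reports": 70},
]

# Higher scores surface first; the heavily flagged hoax sinks in the feed.
for post in sorted(posts, key=rank_score, reverse=True):
    print(post["id"], rank_score(post))
```

Note that no human judgment appears anywhere in this loop; the score only aggregates user behavior, which is exactly what lets the approach scale in the way Nyhan describes.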

https://twitter.com/MikeIsaac/status/557606741433929728

It would be an exaggeration to say that fake news sites have plagued Facebook, but links to stories containing false information meant to drive traffic do exist and can mislead readers across the Internet. Adrienne LaFrance, a former Nieman Lab staffer who is now a senior associate editor at The Atlantic, started a column at Gawker a year ago called Antiviral, aimed at debunking viral hoaxes. She says telling truth from fiction may not be as easy for users as Facebook expects.

“Facebook is adding a layer of what looks like editorial accountability without actually taking on the responsibility of figuring out what’s true,” she wrote in an email. “So Facebook gives the impression that it is an editorial gatekeeper, but there’s still this buffer that protects Facebook from having to actually explain its thinking the way a newsroom would have to.”

Of course, this measure isn’t aimed at mainstream news outlets that get duped by hoaxers; Facebook’s target is much narrower. From the press release:

The vast majority of publishers on Facebook will not be impacted by this update. A small set of publishers who are frequently posting hoaxes and scams will see their distribution decrease.

Craig Silverman, a fellow at Columbia’s Tow Center, recently founded Emergent.info, a “real-time rumor tracker” that “aims to develop best practices for debunking misinformation.” He had reached out to Facebook before yesterday’s announcement in the hope that it would take some kind of action against sites that deliberately circulate false information.

“What they really try to do is jump on things that are already in the news, or celebrities — stuff that has some level of consciousness in the public,” Silverman says of these sites. “They say, based on the story that’s already out there, what can we do that gets a reaction out of people?” Silverman keeps a list of around 16 repeat offenders — including The Daily Currant, National Report, World News Daily Report, Empire News, ScrapeTV, and more — which he sent to Facebook, knowing they wouldn’t blacklist the sites, but hoping they would take some sort of action.

Facebook has shown interest in debunking rumors and hoaxes before. In the past year, it has published two papers that track how rumors spread. One study looked at how users reacted to having their mistaken judgment pointed out to them by friends, typically via a copy-pasted link from Snopes.com, the rumor-fighting website. The researchers found that “people are two times more likely to delete hoaxes after receiving a comment from a friend about it being a hoax.”

But users are also made uncomfortable by having attention drawn to their mistakes, which can decrease interaction and engagement on the site. “By debunking this stuff, you look like a killjoy. You look like a know-it-all,” says Silverman. That finding has, naturally, influenced the way Facebook built its own anti-hoax tool. “They don’t want to put up barriers to sharing, or create negative experiences for people who have done the sharing,” Silverman adds. By introducing a crowd-based user tagging system that de-ranks hoax posts, rather than taking a more direct or aggressive approach, Facebook is attempting to maintain a sense of neutrality in the News Feed.

Facebook says the false news tag is just one of a suite of signals used to guide its ranking algorithm. But as long as the company relies on automation, it’s conceivable that users could band together to abuse the tool.

Twitter has already encountered a version of this problem. In November, a New York Times story about Florida State University football players who received preferential treatment from the police was flagged as spam, causing the link to send readers to a warning page instead of the article. Though Twitter hasn’t made clear exactly what happened, it’s evident that user spam reports can produce errors that hurt publishers. (Twitter hadn’t gotten back to me before publication time.)

“In my research, I’ve found people can be very resistant to unwelcome information,” says Nyhan. “I wonder if people would report things as hoaxes that they don’t like. Imagine you see a story about climate change, and you don’t believe in climate change. If enough people do that, does it start monkeying with the algorithm in problematic ways?”

In response to questions about how they would deal with such an attack, a Facebook spokesperson would only say: “Reporting a story as false is another negative signal, similar to reporting a post as spam. Using a range of signals in ranking helps guard against abuse.”
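
Facebook hasn’t said how those signals are combined, but one plausible defense is to weight each report by the reporter’s track record, so that a coordinated flagging campaign by accounts whose past flags rarely checked out moves a post’s score less than a few reports from historically reliable flaggers. The sketch below illustrates that general idea with invented names and numbers; it is not Facebook’s disclosed method.

```python
# Hypothetical sketch: weight reports by each reporter's track record
# so coordinated mass-flagging has limited effect. Invented names and
# numbers; not Facebook's disclosed method.

def report_signal(reporting_users, accuracy):
    """Sum per-reporter weights, counting each account once."""
    # set() deduplicates, so one account can't flag the same post twice.
    return sum(accuracy.get(user, 0.1) for user in set(reporting_users))

# Fraction of each user's past flags later confirmed as actual hoaxes.
accuracy = {f"brigader_{i}": 0.01 for i in range(100)}
accuracy.update({"careful_a": 0.8, "careful_b": 0.9, "careful_c": 0.85})

campaign = [f"brigader_{i}" for i in range(100)]   # 100 coordinated flags
organic = ["careful_a", "careful_b", "careful_c"]  # 3 reliable flags

print(round(report_signal(campaign, accuracy), 2))  # 1.0
print(round(report_signal(organic, accuracy), 2))   # 2.55
```

Under a scheme like this, the climate-change scenario Nyhan describes would blunt itself over time: accounts that repeatedly flag accurate stories would see their future reports count for less.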

If the tweak works, that will be good news for publishers, who won’t have to compete as directly with fake viral stories. Facebook, long a big driver of news traffic, grew even bigger in 2014, with many of its users getting little news from other sources.

Silverman says it’s important to remember that Facebook’s moves are rooted in self-interest: a better user experience means more engaged users, which means more profit. “They want news producers and content producers to put content on Facebook and do revenue shares. They want that environment to be good for monetization,” he says. “They want people to have a good experience and not say, ‘Everything I saw on my News Feed is garbage.’”

Photo by Franco Bouly used under a Creative Commons license.
