Oct. 26, 2021, 1:30 p.m.

As if there wasn’t enough Facebook news to digest already, another deep dive from The Washington Post this morning revealed that Facebook engineers changed the company’s algorithm to prioritize and elevate posts that elicited emoji reactions. Starting in 2017, the ranking algorithm treated reactions such as “angry,” “love,” “sad,” and “wow” as five times more valuable than traditional “likes” on the social media platform.
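
The documents don’t include Facebook’s actual ranking code, but the weighting itself is simple arithmetic. As a rough sketch (in Python, with invented names and numbers; the only detail taken from the Post’s reporting is the 5x multiplier), a score built this way might look like:

```python
# Hypothetical sketch only -- Facebook's real ranking system is not public.
# The one detail taken from the reporting is the 5x reaction-to-like weight.

LIKE_WEIGHT = 1
REACTION_WEIGHT = 5 * LIKE_WEIGHT  # "angry," "love," "sad," "wow," etc.

def engagement_score(likes: int, reactions: int) -> int:
    """Toy feed-ranking signal: each emoji reaction counts as five likes."""
    return likes * LIKE_WEIGHT + reactions * REACTION_WEIGHT

# A post with fewer total interactions can outrank a better-liked one
# if it provokes enough reactions:
print(engagement_score(likes=1000, reactions=0))   # 1000
print(engagement_score(likes=200, reactions=300))  # 1700
```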

The problem with this plan for engagement: posts likely to provoke those reactions were more likely to show up in users’ feeds, and they were also more likely to contain misinformation, spam, or clickbait. One Facebook staffer, whose name was redacted in the trove of documents that whistleblower and former Facebook employee Frances Haugen shared with the Securities and Exchange Commission, had warned that this might happen; they were proven right.

According to the Post, “The company’s data scientists confirmed in 2019 that posts that sparked angry reaction emoji were disproportionately likely to include misinformation, toxicity and low-quality news.”

More on that:

That means Facebook for three years systematically amped up some of the worst of its platform, making it more prominent in users’ feeds and spreading it to a much wider audience. The power of the algorithmic promotion undermined the efforts of Facebook’s content moderators and integrity teams, who were fighting an uphill battle against toxic and harmful content.

This isn’t the first time that “anger” has reared its ugly head as a useful metric. Back in 2017, a report found that hyper-political publishers were especially adept at provoking their readers’ anger. And in 2019, another report found some of the effects of Facebook’s shift to prioritizing “meaningful interactions”:

  • It has pushed up articles on divisive topics like abortion, religion, and guns;
  • politics rules; and
  • the “angry” reaction (😡) dominates many pages, with “Fox News driving the most angry reactions of anyone, with nearly double that of anyone else.”

Facebook introduced the suite of “reaction” emojis in response to a decline in people talking to each other on the social platform, according to the report. Giving the reactions five times the value of a single like was Facebook’s effort to signal that “the post had made a greater emotional impression than a like; reacting with an emoji took an extra step beyond the single click or tap of the like button.”

Mark Zuckerberg acknowledges that reactions can be used to indicate dislike.

Members of Facebook’s integrity teams raised concerns about the amplification of “anger” as a societal emotion, the documents reviewed by the Post show, but managers had a mixed record when it came to responding to these concerns.

A screenshot showing a staffer raising the question, “Quick question to play devil’s advocate: will weighting Reactions 5x stronger than Likes lead to News Feed having a higher ratio of controversial than agreeable content?”

According to the latest documents, even efforts to counteract this effect produced less-than-desirable results when they were actually implemented. For instance, when Facebook employees tried to lower the score of a high-ranking post so it would surface less often, things often didn’t work out as planned.

If Facebook’s algorithms thought a post was bad, Facebook could cut its score in half, pushing most instances of the post way down in users’ feeds. But a few posts could get scores as high as a billion, according to the documents. Cutting an astronomical score in half to “demote” it would still leave it with a score high enough to appear at the top of users’ feeds.
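
The arithmetic behind that failure is straightforward. A toy example using the Post’s billion-point figure (the “typical” score here is an invented stand-in, not a number from the documents):

```python
# Toy arithmetic: a 50% demotion barely dents a post whose score sits
# orders of magnitude above ordinary posts.

viral_score = 1_000_000_000   # "scores as high as a billion," per the documents
typical_score = 10_000        # invented stand-in for an ordinary strong post

demoted = viral_score // 2    # the demotion: "cut its score in half"
print(demoted)                  # 500000000
print(demoted / typical_score)  # 50000.0 -- still dwarfs normal posts
```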

“Scary thought: civic demotions not working,” one Facebook employee noted.

The Post’s story details Facebook’s various attempts to dial back this amplification of reaction-driven posts.

When Facebook finally set the weight on the angry reaction to zero, users began to see less misinformation, less “disturbing” content, and less “graphic violence,” company data scientists found. As it turned out, after years of advocacy and pushback, there wasn’t a trade-off after all: according to one of the documents, users’ level of activity on Facebook was unaffected.
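
In terms of the earlier sketch, the fix amounts to making the weight per-reaction and zeroing out “angry.” Again, the code and most of the numbers here are illustrative; only the zeroed angry weight reflects the change the documents describe:

```python
# Hypothetical per-reaction weights; only the zeroed "angry" weight reflects
# the change described in the documents.

REACTION_WEIGHTS = {"like": 1, "love": 5, "sad": 5, "wow": 5, "angry": 0}

def engagement_score(counts: dict) -> int:
    """Toy score: each reaction count times its configured weight."""
    return sum(n * REACTION_WEIGHTS.get(kind, 0) for kind, n in counts.items())

# An anger-bait post no longer gets a ranking boost from those reactions:
print(engagement_score({"like": 200, "angry": 300}))  # 200
```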

Facebook’s response to the latest reporting linking its algorithm to the prioritization of “anger,” and of posts that tend to provoke that emotion in users: “We continue to work to understand what content creates negative experiences, so we can reduce its distribution. This includes content that has a disproportionate amount of angry reactions, for example,” Facebook spokesperson Dani Lever told the Post.
