Nieman Foundation at Harvard
Oct. 26, 2021, 1:30 p.m.

As if there wasn’t enough Facebook news to digest already, another deep dive from The Washington Post this morning revealed that in 2017, Facebook engineers changed the company’s algorithm to prioritize and elevate posts that elicited emoji reactions. More specifically, the ranking algorithm treated reactions such as “angry,” “love,” “sad,” and “wow” as five times more valuable than traditional “likes” on the platform.
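The weighting the documents describe can be sketched as simple arithmetic. The function and numbers below are purely illustrative — Facebook’s real ranking system is far more complex and its internals aren’t public — but they show how a 5x multiplier tilts the scales:

```python
# Illustrative sketch of the weighting described in the documents:
# each emoji reaction counted five times as much as a "like".
# All names and numbers here are hypothetical, not Facebook's actual code.
LIKE_WEIGHT = 1
REACTION_WEIGHT = 5  # applied to "angry", "love", "sad", "wow", etc.

def engagement_score(likes: int, reactions: int) -> int:
    """Toy engagement score combining likes and emoji reactions."""
    return likes * LIKE_WEIGHT + reactions * REACTION_WEIGHT

# Under this weighting, a post with 100 likes scores no higher than
# a post with just 20 angry reactions.
print(engagement_score(likes=100, reactions=0))  # 100
print(engagement_score(likes=0, reactions=20))   # 100
```

The asymmetry is the point: a small number of strong emotional responses could outweigh a much larger number of quiet approvals, which is exactly the dynamic the Post’s story describes.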

The problem with this plan for engagement: posts likely to elicit those reactions showed up more often, and those posts were also more likely to contain misinformation, spam, or clickbait. One Facebook staffer, whose name was redacted in the trove of documents that whistleblower and former Facebook employee Frances Haugen shared with the Securities and Exchange Commission, had warned that this might happen; they were proven right.

According to the Post, “The company’s data scientists confirmed in 2019 that posts that sparked angry reaction emoji were disproportionately likely to include misinformation, toxicity and low-quality news.”

More on that:

That means Facebook for three years systematically amped up some of the worst of its platform, making it more prominent in users’ feeds and spreading it to a much wider audience. The power of the algorithmic promotion undermined the efforts of Facebook’s content moderators and integrity teams, who were fighting an uphill battle against toxic and harmful content.

This isn’t the first time that “anger” has reared its ugly head as a useful metric. Back in 2017, a report found that hyper-political publishers were especially adept at provoking the anger of their readers. And in 2019, another report found some of the effects of Facebook’s shift toward prioritizing “meaningful interactions”:

  • the change pushed up articles on divisive topics like abortion, religion, and guns;
  • politics ruled the platform; and
  • the “angry” reaction (😡) dominated many pages, with “Fox News driving the most angry reactions of anyone, with nearly double that of anyone else.”

Facebook introduced the suite of “reaction” emojis in response to a decline in people talking to each other on the social platform, according to the report. Giving the reactions five times the value of a single like was Facebook’s effort to signal that “the post had made a greater emotional impression than a like; reacting with an emoji took an extra step beyond the single click or tap of the like button.”

Mark Zuckerberg acknowledges that reactions can be used to indicate dislike.

Members of Facebook’s integrity teams raised concerns about the amplification of “anger” as a societal emotion, the documents reviewed by the Post show, but managers had a mixed record when it came to responding to these concerns.

A screenshot showing a staffer raising the question, “Quick question to play devil’s advocate: will weighting Reactions 5x stronger than Likes lead to News Feed having a higher ratio of controversial than agreeable content?”

According to the latest documents, even efforts to counteract this effect — when they were actually implemented — produced less-than-desirable results. For instance, even when Facebook employees tried to lower the score of a high-ranking post so it would show up less often, things frequently didn’t work out as planned.

If Facebook’s algorithms thought a post was bad, Facebook could cut its score in half, pushing most instances of the post way down in users’ feeds. But a few posts could get scores as high as a billion, according to the documents. Cutting an astronomical score in half to “demote” it would still leave it with a score high enough to appear at the top of users’ feeds.
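The failure mode here is plain arithmetic: halving a score only works as a demotion if scores sit on comparable scales. A toy illustration, with entirely hypothetical numbers:

```python
# Toy model of the demotion described in the documents: a flagged
# post's ranking score is simply cut in half. Numbers are hypothetical.
def demote(score: float) -> float:
    """Halve a flagged post's ranking score."""
    return score / 2

typical_score = 1_000            # hypothetical score of an ordinary post
viral_bad_score = 1_000_000_000  # documents describe scores as high as a billion

# Even after demotion, the flagged post still dwarfs ordinary posts.
print(demote(viral_bad_score))                   # 500000000.0
print(demote(viral_bad_score) > typical_score)   # True
```

Half of a billion is still five hundred million, which is why, as the employee quoted below put it, the demotions weren’t working.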

“Scary thought: civic demotions not working,” one Facebook employee noted.

The Post’s story details Facebook’s different attempts at dialing down this effect of amplifying reaction-driven posts.

When Facebook finally set the weight on the angry reaction to zero, users began to get less misinformation, less “disturbing” content and less “graphic violence,” company data scientists found. As it turned out, after years of advocacy and pushback, there wasn’t a trade-off after all. According to one of the documents, users’ level of activity on Facebook was unaffected.

Facebook’s response to this latest finding linking its algorithm to the prioritization of “anger,” and of posts that tend to provoke that emotion in users: “We continue to work to understand what content creates negative experiences, so we can reduce its distribution. This includes content that has a disproportionate amount of angry reactions, for example,” Facebook spokesperson Dani Lever told the Post.
