Oct. 26, 2021, 1:30 p.m.

As if there wasn’t enough Facebook news to digest already, another deep dive from The Washington Post this morning revealed that Facebook engineers changed the company’s algorithm to prioritize and elevate posts that elicited emoji reactions, a change rolled out in 2017. More specifically, the ranking algorithm treated reactions such as “angry,” “love,” “sad,” and “wow” as five times more valuable than traditional “likes” on the social media platform.
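For the mechanics, here is a minimal sketch of how such a weighted score might be computed. Only the five-to-one ratio between reactions and likes comes from the Post’s reporting; the function, constant names, and example numbers are hypothetical.

    # Hypothetical sketch of weighted engagement scoring (Python).
    # Only the 5x reaction-to-like ratio comes from the reporting;
    # everything else here is illustrative.
    REACTION_WEIGHT = 5  # "angry," "love," "sad," "wow," etc.
    LIKE_WEIGHT = 1

    def engagement_score(likes: int, reactions: int) -> int:
        """Each emoji reaction counts five times as much as a like."""
        return LIKE_WEIGHT * likes + REACTION_WEIGHT * reactions

    # A post with 100 likes scores the same as one with just 20
    # emoji reactions, so provocative posts keep pace cheaply.
    assert engagement_score(100, 0) == engagement_score(0, 20)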

The problem with this plan for engagement: posts likely to elicit those reactions rose higher in users’ feeds, and those same posts were also likely to contain misinformation, spam, or clickbait. One Facebook staffer, whose name was redacted in a trove of documents shared with the Securities and Exchange Commission by whistleblower and former Facebook employee Frances Haugen, had warned that this might happen; the warning proved correct.

According to the Post, “The company’s data scientists confirmed in 2019 that posts that sparked angry reaction emoji were disproportionately likely to include misinformation, toxicity and low-quality news.”

More on that:

That means Facebook for three years systematically amped up some of the worst of its platform, making it more prominent in users’ feeds and spreading it to a much wider audience. The power of the algorithmic promotion undermined the efforts of Facebook’s content moderators and integrity teams, who were fighting an uphill battle against toxic and harmful content.

This isn’t the first time that “anger” has reared its ugly head as a useful metric. Back in 2017, a report found that hyper-political publishers were especially adept at provoking the anger of their readers. And in 2019, another report found some of the effects of Facebook’s shift toward prioritizing “meaningful interactions”:

  • it has pushed up articles on divisive topics like abortion, religion, and guns;
  • politics rules; and
  • the “angry” reaction (😡) dominates many pages, with “Fox News driving the most angry reactions of anyone, with nearly double that of anyone else.”

Facebook introduced the suite of “reaction” emojis in response to a decline in people talking to each other on the social platform, according to the report. Giving the reactions five times the value of a single like was Facebook’s effort to signal that “the post had made a greater emotional impression than a like; reacting with an emoji took an extra step beyond the single click or tap of the like button.”

[Screenshot: Mark Zuckerberg acknowledging that reactions can be used to indicate dislike.]

Members of Facebook’s integrity teams raised concerns about the amplification of “anger” as a societal emotion, the documents reviewed by the Post show, but managers had a mixed record when it came to responding to these concerns.

[Screenshot: a staffer raises the question, “Quick question to play devil’s advocate: will weighting Reactions 5x stronger than Likes lead to News Feed having a higher ratio of controversial than agreeable content?”]

According to the latest documents, even efforts to counteract this effect — when they were actually implemented — produced less-than-desirable results. For instance, when Facebook employees tried to lower the score of a high-ranking post so that it would show up less often, things often didn’t work out as planned.

If Facebook’s algorithms thought a post was bad, Facebook could cut its score in half, pushing most instances of the post way down in users’ feeds. But a few posts could earn scores as high as a billion, according to the documents. Cutting such an astronomical score in half to “demote” the post would still leave it with a score high enough to appear at the top of users’ feeds.
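The arithmetic is easy to check. In the sketch below, the billion-point score comes from the documents; the baseline score and the demote function are hypothetical stand-ins.

    # Why halving a score fails as a demotion (illustrative numbers,
    # except the billion-point outlier cited in the documents).
    typical_score = 1_000          # hypothetical ordinary post
    outlier_score = 1_000_000_000  # "scores as high as a billion"

    def demote(score: float) -> float:
        """Cut a flagged post's ranking score in half."""
        return score / 2

    # After demotion, the outlier still dwarfs ordinary posts and
    # stays at the top of the feed: 500,000,000 vs. 1,000.
    assert demote(outlier_score) > typical_score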

“Scary thought: civic demotions not working,” one Facebook employee noted.

The Post’s story details Facebook’s different attempts at dialing down this effect of amplifying reaction-driven posts.

When Facebook finally set the weight on the angry reaction to zero, users began to get less misinformation, less “disturbing” content and less “graphic violence,” company data scientists found. As it turned out, after years of advocacy and pushback, there wasn’t a trade-off after all. According to one of the documents, users’ level of activity on Facebook was unaffected.
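In terms of the earlier scoring sketch, that fix amounts to zeroing out a single weight. This is a hypothetical illustration of the change the Post describes, not Facebook’s actual code.

    # Hypothetical per-reaction weights; zeroing "angry" mirrors
    # the change the Post describes.
    weights = {"like": 1, "love": 5, "sad": 5, "wow": 5, "angry": 5}
    weights["angry"] = 0  # angry reactions no longer boost ranking

    def score(counts: dict[str, int]) -> int:
        """Weighted sum of a post's reaction counts."""
        return sum(weights[r] * n for r, n in counts.items())

    # An angry-bait post loses its ranking advantage:
    assert score({"like": 100, "angry": 50}) == 100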

Facebook’s response to this latest finding, which links its algorithm to the prioritization of “anger” and of posts that tend to evoke that emotion in users: “We continue to work to understand what content creates negative experiences, so we can reduce its distribution. This includes content that has a disproportionate amount of angry reactions, for example,” Facebook spokesperson Dani Lever told the Post.
