Nieman Foundation at Harvard
Sept. 13, 2021, 12:51 p.m.

A new NYU report finds that Facebook is part of the polarization problem, but not all of it

But its recommendations to reduce polarization don’t target the people who might have the most direct influence.

If you’re not already sitting down, I recommend it, because the main takeaway from this new NYU report on social media and political polarization is downright shocking:

We conclude that social media platforms are not the main cause of rising partisan hatred, but use of these platforms intensifies divisiveness and thus contributes to its corrosive effects.

So Facebook’s a problem, but it’s not the only problem? I know, right? Mind-blowing stuff.

But more seriously, I recommend taking a look at the paper, out today from the Center for Business and Human Rights at NYU’s Stern School of Business. Not because there’s anything groundbreaking about it, but because it’s a very useful summation of current research into how big a role social media has played in :gesticulates wildly: all of this.

As co-author Justin Hendrix writes in summary elsewhere:

Absent significant reforms by the federal government and the social media companies themselves, the platforms will continue to contribute to some of the worst consequences of polarization. These include declining trust in institutions; scorn for facts; legislative dysfunction; erosion of democratic norms; and, ultimately, real-world violence, such as the January 6 insurrection.

The report, which included more than 40 interviews with researchers and other experts, highlights some of the most interesting studies published on digital polarization. Among them:

  • A study in which researchers “paid American subjects to stop using Facebook for a month, until just after the 2018 midterm elections.” They found the Facebook break “significantly reduced polarization of views on policy issues” but “didn’t reduce affective polarization” — roughly, how much each side hates the other’s guts — “in a statistically significant way.”
  • This paper examining the impact of having Republicans follow a Twitter bot that only retweeted liberals, and vice versa for Democrats. Democrats’ political views didn’t change significantly after the bot exposure, but Republicans “became substantially more conservative…Attempts to introduce people to a broad range of opposing political views on a social media site such as Twitter might be not only ineffective but counterproductive.”
  • This study that found “that Facebook’s content-ranking algorithm may limit users’ exposure to news outlets offering viewpoints contrary to their own—and thereby increase polarization.”
  • And this one, which concluded that the element of social networks that increases polarization is “the fundamental design of the automated systems that run the platforms” — a.k.a., their algorithms. “Social media technology employs popularity-based algorithms that tailor content to maximize user engagement,” researchers found, and that amplifies “the contagious power of content that elicits sectarian fear or indignation.”

All in all, if you want to have a sophisticated understanding of how Facebook made your uncle so. mad. all. the. time., the first 20 or so pages of this report are as good a summary as you’ll find.

The rest of the report, though, is the NYU researchers (Paul M. Barrett and J. Grant Sims, along with Hendrix) making recommendations about how to fix it, and they’re a bit underwhelming.

Throughout the research summaries, the report does acknowledge that all that “declining trust in institutions,” “scorn for facts,” “erosion of democratic norms” stuff isn’t evenly distributed along the political spectrum. It’s a much bigger phenomenon on the right than on the left, as study after study has confirmed.

But when it comes to policy recommendations, that acknowledgement gets wrapped in the more comfortable language of bipartisanship, where the main concerns are “polarization,” “divisiveness,” “declining trust,” and other terms without any ideological valence.

Their ideas all sound reasonable enough: mandating more disclosure about platform algorithms; asking those companies to shift their algorithmic gears from “Drive Polarization” to “Reverse”; adding more human content moderators and treating them better. But their No. 1 recommendation is to President Biden:

In one or more speeches, by means of a bipartisan blue-ribbon commission, or via some other high-visibility vehicle, he should tell both lawmakers and the public that to avoid the politicization of public health crises and future versions of the Capitol insurrection, we must confront online polarization and its malign effects. By demonstrating leadership in this fashion, Biden can begin to break the logjam in Congress and open a path for achieving other goals outlined here.

I don’t want to sound too cynical, but: What is the universe in which a Biden-summoned blue-ribbon commission of experts reduces polarization?

The primary manifestation of the whole phenomenon is a group of people who don’t trust experts, who think the vaccines they’ve created are secret tools of mass sterilization — people who, to an alarming degree, “believe that the U.S. government, media, and financial sector are controlled by a group of Satan-worshipping pedophiles who run a global child sex trafficking operation.” Their minds are going to be changed by putting Christine Todd Whitman and Charlie Baker on some commission that holds hearings in the nation’s Marriott ballrooms?

The NYU recommendations target Democrats in Washington and tech giants in Silicon Valley. They both have real and obvious roles to play, and there are some good ideas for them here. But there’s little to nothing asked of Republicans, conservatives, religious groups, local governments, or anyone else more closely adjacent to the people who are driving this democracy-damaging behavior. The body of the report refers to “Republicans” or “conservatives” 47 times; its recommendations don’t mention either at all.

And hey, I get it — the people who read this report by some NYU eggheads are probably the types who start salivating Pavlov-style at the mention of a blue-ribbon commission. But at a certain point, there has to be some sort of recognition that tweaked algorithms and House committee hearings can only do so much. If your uncle does ever stop thinking Hillary Clinton eats babies, it’s highly unlikely to be a sociology professor, Rockefeller Republican, or Democratic éminence grise who’ll do it.

Joshua Benton is the senior writer and former director of Nieman Lab. You can reach him via email or Twitter DM (@jbenton).