Feb. 11, 2021, 10:58 a.m.
Audience & Social

Some Facebook users will “temporarily” see less political content in their feeds. (They already didn’t see much.)

The tests will rely on an algorithm to detect political content. It’s unclear how it will handle breaking news articles or posts from groups.

Facebook is taking small steps toward reducing the amount of political news that some users in some countries see in their News Feeds, the company announced this week in a blog post:

Over the next few months, we’ll work to better understand people’s varied preferences for political content and test a number of approaches based on those insights. As a first step, we’ll temporarily reduce the distribution of political content in News Feed for a small percentage of people in Canada, Brazil and Indonesia this week, and the US in the coming weeks. During these initial tests we’ll explore a variety of ways to rank political content in people’s feeds using different signals, and then decide on the approaches we’ll use going forward. Covid-19 information from authoritative health organizations like the CDC and WHO, as well as national and regional health agencies and services from affected countries, will be exempt from these tests. Content from official government agencies and services will also be exempt.

U.S. Facebook users already don’t see that much political content in their News Feeds, according to both the company and research we did last year. More than half the people in our survey saw no news at all. I wrote at the time:

141 of the 1,730 posts in my sample could be described as political content … In other words, in this survey, political content made up about 8% of what people saw. That’s pretty close to Facebook’s claimed 6%.

In the new test, Facebook will use “a machine learning model that is trained to look for signals of political content and predict whether a post is related to politics,” the company said in a statement.

It’s not clear how much this (again, temporary) change will affect the feeds of the small number of people who will be part of the experiment, or how Facebook’s algorithm will define “politics,” or what kinds of posts will be included in the changes: News articles and breaking political news? Memes? Political rants from relatives? Matt Kiser, who runs the politics blog and newsletter “What the Fuck Just Happened Today?”, tweeted that he suspects the changes are already underway.

A lot of political discussion happens within private groups — and Facebook’s own internal analysis, made public by The Wall Street Journal in January, found that “70% of the top 100 most active U.S. Civic Groups are considered non-recommendable for issues such as hate, misinfo, bullying and harassment.” Facebook said last year, and reiterated in January, that it had stopped automatically recommending new political groups to its users — though an investigation by The Markup found it was continuing to do so, and was especially likely to recommend groups to Trump voters. It’s unclear how political content from groups will be treated in the new, experimental News Feed changes.
