Nieman Foundation at Harvard
Feb. 11, 2021, 10:58 a.m.
Audience & Social

Some Facebook users will “temporarily” see less political content in their feeds. (They already didn’t see much.)

The tests will rely on an algorithm to detect political content. It’s unclear how it will handle breaking news articles or posts from groups.

Facebook is taking small steps toward reducing the amount of political news that some users in some countries see in their News Feeds, the company announced this week in a blog post:

Over the next few months, we’ll work to better understand peoples’ varied preferences for political content and test a number of approaches based on those insights. As a first step, we’ll temporarily reduce the distribution of political content in News Feed for a small percentage of people in Canada, Brazil and Indonesia this week, and the US in the coming weeks. During these initial tests we’ll explore a variety of ways to rank political content in people’s feeds using different signals, and then decide on the approaches we’ll use going forward. Covid-19 information from authoritative health organizations like the CDC and WHO, as well as national and regional health agencies and services from affected countries, will be exempt from these tests. Content from official government agencies and services will also be exempt.

U.S. Facebook users already don't see much political content in their News Feeds, according to both the company and research we did last year. More than half the people in our survey saw no news at all. I wrote at the time:

141 of the 1,730 posts in my sample could be described as political content … In other words, in this survey, political content made up about 8% of what people saw. That’s pretty close to Facebook’s claimed 6%.
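The percentage quoted above follows directly from the survey counts; a quick arithmetic check (141 and 1,730 are the figures from the survey):

```python
# Quick check of the survey arithmetic quoted above.
political_posts = 141
total_posts = 1_730

share = political_posts / total_posts
print(f"{share:.1%}")  # ~8.2%, in the same ballpark as Facebook's claimed 6%
```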

In the new test, Facebook will use “a machine learning model that is trained to look for signals of political content and predict whether a post is related to politics,” the company said in a statement.
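Facebook hasn't published details of that model. The general shape of the approach, though — score a post on textual signals and threshold the result — can be sketched with a deliberately toy keyword-based classifier. Everything here (the signal words, the threshold, the function name) is invented for illustration; a real system would use a trained machine learning model, not a hand-picked word list:

```python
import re

# Invented example signals -- NOT Facebook's actual feature set.
POLITICAL_SIGNALS = {"election", "senator", "congress", "ballot", "president"}

def looks_political(post_text: str, threshold: int = 1) -> bool:
    """Toy classifier: flag a post if it contains at least
    `threshold` words from the signal list."""
    words = set(re.findall(r"[a-z']+", post_text.lower()))
    return len(words & POLITICAL_SIGNALS) >= threshold

print(looks_political("The senator's ballot measure passed"))  # True
print(looks_political("Check out my new sourdough recipe"))    # False
```

Even this toy version hints at the hard cases the article goes on to raise: a breaking-news headline, a meme, or a relative's rant may carry none of the obvious signal words yet still be political.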

It’s not clear how much this (again, temporary) change will affect the feeds of the small number of people who will be part of the experiment, or how Facebook’s algorithm will define “politics,” or what kinds of posts will be included in the changes: News articles and breaking political news? Memes? Political rants from relatives? Matt Kiser, who runs the politics blog and newsletter “What the Fuck Just Happened Today?” tweeted that he suspects the changes are underway already.

A lot of political discussion happens within private groups — and Facebook’s own internal analysis, made public by The Wall Street Journal in January, found that “70% of the top 100 most active U.S. Civic Groups are considered non-recommendable for issues such as hate, misinfo, bullying and harassment.” Facebook said last year, and reiterated in January, that it had stopped automatically recommending new political groups to its users — though an investigation by The Markup found it was continuing to do so, and was especially likely to recommend groups to Trump voters. It’s unclear how political content from groups will be treated in the new, experimental News Feed changes.

Laura Hazard Owen is the editor of Nieman Lab. You can reach her via email (laura_owen@harvard.edu) or Twitter DM (@laurahazardowen).