Nieman Foundation at Harvard
Feb. 11, 2021, 10:58 a.m.
Audience & Social

Some Facebook users will “temporarily” see less political content in their feeds. (They already didn’t see much.)

The tests will rely on an algorithm to detect political content. It’s unclear how it will handle breaking news articles or posts from groups.

Facebook is taking small steps toward reducing the amount of political news that some users in some countries see in their News Feeds, the company announced this week in a blog post:

Over the next few months, we’ll work to better understand peoples’ varied preferences for political content and test a number of approaches based on those insights. As a first step, we’ll temporarily reduce the distribution of political content in News Feed for a small percentage of people in Canada, Brazil and Indonesia this week, and the US in the coming weeks. During these initial tests we’ll explore a variety of ways to rank political content in people’s feeds using different signals, and then decide on the approaches we’ll use going forward. Covid-19 information from authoritative health organizations like the CDC and WHO, as well as national and regional health agencies and services from affected countries, will be exempt from these tests. Content from official government agencies and services will also be exempt.

U.S. Facebook users already don’t see much political content in their News Feeds, according to both the company and research we did last year. More than half the people in our survey saw no news at all. I wrote at the time:

141 of the 1,730 posts in my sample could be described as political content … In other words, in this survey, political content made up about 8% of what people saw. That’s pretty close to Facebook’s claimed 6%.

In the new test, Facebook will use “a machine learning model that is trained to look for signals of political content and predict whether a post is related to politics,” the company said in a statement.
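Facebook hasn’t published any details of that model. As a rough illustration only, here is what the general kind of system it describes — a classifier trained on labeled examples to predict whether text is political — might look like in miniature. Every post, label, and design choice below is invented for the sketch; a production system would use far richer signals than post text alone.

```python
# A toy sketch (not Facebook's actual system) of a "political content"
# text classifier: train on labeled posts, then score new ones.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labeled posts: 1 = political, 0 = not political.
posts = [
    "The senator's new election campaign promises tax reform",
    "Congress votes on the controversial immigration bill today",
    "Protesters rally against the government's new policy",
    "The president signed an executive order on healthcare",
    "My grandma's lasagna recipe is the best in the family",
    "Check out these cute photos of my new puppy",
    "We hiked the mountain trail this weekend, amazing views",
    "Happy birthday to my best friend, love you!",
]
labels = [1, 1, 1, 1, 0, 0, 0, 0]

# TF-IDF turns each post into word-weight features; logistic
# regression learns which words signal "political."
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(posts, labels)

# Score a new, unseen post.
print(model.predict(["Congress debates the new tax reform bill"])[0])
```

Even this toy version shows why the questions below matter: the model only knows what its training labels tell it, so whether a meme, a breaking-news link, or a relative’s rant counts as “political” depends entirely on how the training data defined the category.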

It’s not clear how much this (again, temporary) change will affect the feeds of the small number of people who will be part of the experiment, or how Facebook’s algorithm will define “politics,” or what kinds of posts will be included in the changes: News articles and breaking political news? Memes? Political rants from relatives? Matt Kiser, who runs the politics blog and newsletter “What the Fuck Just Happened Today?” tweeted that he suspects the changes are already underway.

A lot of political discussion happens within private groups — and Facebook’s own internal analysis, made public by The Wall Street Journal in January, found that “70% of the top 100 most active U.S. Civic Groups are considered non-recommendable for issues such as hate, misinfo, bullying and harassment.” Facebook said last year, and reiterated in January, that it had stopped automatically recommending new political groups to its users — though an investigation by The Markup found it was continuing to do so, and was especially likely to recommend groups to Trump voters. It’s unclear how political content from groups will be treated in the new, experimental News Feed changes.
