March 23, 2022, 10:07 a.m.
Audience & Social

If someone shares your politics, you’re less likely to block them when they post misinformation

“People give a pass to their like-minded friends who share misinformation.”

It’s a set of actions that’s probably familiar to many Facebook users by now: You see a friend — perhaps an older relative or someone you’ve lost touch with over the years — share questionable, offensive, or downright inaccurate posts, and eventually you reach for that “Unfollow” button.

A new study published last week in the Journal of Communication unpacks some of the patterns associated with this tried-and-tested method of limiting the misinformation that users opt to see when scrolling through their Facebook feeds. In the study of just under 1,000 volunteers, researchers Johannes Kaiser, Cristian Vaccari, and Andrew Chadwick found that users were more likely to block those who shared misinformation when the sharer's political ideology differed from their own.

“People give a pass to their like-minded friends who share misinformation, but they are much more likely to block or unfollow friends that are not in agreement with them politically when they share misinformation on social media,” said Cristian Vaccari, professor of political communication at Loughborough University in the U.K. and an author of the study.

People whose political ideology leaned left, and especially those on the far left, tended to be the most likely to block users in response to misinformation sharing. People whose ideology was more conservative tended to be more tolerant of those who shared misinformation.

The researchers recruited 986 volunteers in Germany to be a part of a simulation experiment. Why a simulation? “We didn’t conduct the experiment on Facebook because we can’t do that,” Vaccari said. “Facebook could do something very realistic with their interface, but researchers don’t have access to those tools.”

Why Germany? “Germany is very different from the United States,” said Vaccari. Germany is a parliamentary republic, and voters often have a choice of multiple parties. Right- and left-wing parties can form coalitions, and “voters are a lot less inclined to see voters and politicians from the other side in an antagonistic way, the way American voters do.” The researchers believed that conducting the experiment in this context would yield results not colored by hyperpartisan politics and polarization.

The volunteers were asked to answer a series of questions about their political beliefs, and their ideology was placed on an 11-point scale. Volunteers were also asked to think of — and name — friends with similar and dissimilar political leanings. Vaccari and his team then created fake Facebook profiles of these friends and had the volunteers look at their feeds.

Made-up news articles about two relatively non-contentious (in Germany, anyway) topics — housing and education — were posted to the feeds.

The researchers also created two versions of each fabricated article. One version was plausible enough to perhaps be true; the other was so outrageous it would likely be recognized immediately as misinformation. (Participants were told after the experiment that the articles they saw weren’t real.)

In one example, the plausible version of an article reported the maximum allowed rent hike rising only from 10% to 12%.

The implausible version, in contrast, had the rent hike maximum jumping from 10% to 50%.

Volunteers were then asked whether they would block the friend in question, based on what they’d shared.
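To make the experimental design concrete, here is a toy sketch in Python. It is not the researchers' actual instrument or data: the function names and every probability below are invented placeholders, chosen only to mirror the direction of the effects the study reports. Each simulated trial crosses friend similarity (politically similar vs. dissimilar) with article plausibility (plausible vs. outrageous) and records a binary block decision:

```python
# Toy sketch of the study's design, NOT the researchers' code or data.
# All probabilities are invented placeholders that only mirror the
# direction of the reported effects.
import random
from dataclasses import dataclass

@dataclass
class Trial:
    ideology: int            # 11-point scale, here -5 (far left) to +5 (far right)
    friend_similar: bool     # friend named as politically similar?
    article_plausible: bool  # plausible version vs. outrageous version
    blocked: bool            # participant's response

def simulate_trial(ideology: int, friend_similar: bool,
                   article_plausible: bool) -> Trial:
    # Hypothetical blocking propensity: implausible posts and politically
    # dissimilar friends get blocked more, and left-leaning participants
    # block more overall (the pattern the study describes).
    p = 0.10
    if not article_plausible:
        p += 0.25
    if not friend_similar:
        p += 0.20
    if ideology < 0:  # left of center, scaled toward the far left
        p += 0.05 * min(-ideology, 5) / 5
    return Trial(ideology, friend_similar, article_plausible,
                 random.random() < p)

random.seed(1)
trials = [
    simulate_trial(random.randint(-5, 5), sim, plaus)
    for _ in range(986)            # the study's sample size
    for sim in (True, False)       # similar vs. dissimilar friend
    for plaus in (True, False)     # plausible vs. implausible article
]

# Compare blocking rates by friend similarity, pooling over plausibility.
for sim in (True, False):
    subset = [t for t in trials if t.friend_similar == sim]
    rate = sum(t.blocked for t in subset) / len(subset)
    print(f"friend_similar={sim}: block rate {rate:.2f}")
```

Crossing the two factors this way is what lets the analysis separate a general aversion to misinformation (the plausibility effect) from the partisan pattern the study highlights (the similarity effect).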

“We thought, the bigger lie, the more newsworthy but also the more inaccurate the post, the more likely it would be blocked by people, and that was true,” Vaccari said. Across the political spectrum, volunteers were more likely to block users when the more implausible or extreme version of the article was shared.

Still, it was “mostly people on the left that engaged in this kind of behavior, and especially those who were extremely on the left,” Vaccari said. “People on the right are much less likely to block people based on their ideological dissimilarity.”

One explanation for these political differences, although speculative, could be the need for a shared social identity: “I think it’s probably something to do with identity more than belief,” Vaccari said. “You might not believe the information shared is accurate, but you might not block that person because it’s a relationship you value.”

Another reason might be related to what previous research has shown, which is that right-wing voters tend to share more misinformation on social media. “So it might be that if you are a left-wing voter, you are used to seeing quite a lot of misinformation shared by right-wing voters that you are in contact with on social media. And so you might have become more used to blocking these people because you know they are more likely to share misinformation,” Vaccari said.

One takeaway, as previous studies about echo chambers have shown, is that such partisan tendencies in blocking could further polarize people and lead to a less diverse flow of information on social media channels. “If people are biased in favor of their own party, it may get rid of misinformation, but it also gets rid of alternate views,” Vaccari said.

Of course, this comes with all the caveats of the study: the German political context, the fact that participants made their decisions based on posts about non-partisan issues, and the fact that they were shown only a single post before deciding (“In reality, people are likely to have things accumulate before they act,” Vaccari said).

“I think that probably the most important takeaway is that there are some drawbacks to the widespread assumption that one of the best ways to protect people against disinformation is to give users tools that enable them to limit contact with other people who share misinformation,” Vaccari told me. “If people applied those tools in a politically neutral way, then there would be no problem with that argument. But the problem, as this study shows, is that people apply those blocking and unfollowing tools in a way that is partisan.”

Image of unfriending on Facebook by Oliver Dunkley is being used under a Creative Commons License.
