Nieman Foundation at Harvard
May 24, 2019, 1:03 p.m.

It’s not me, it’s you: Our Facebook fears are mostly about all those other gullible types

“Society may bear some responsibility, but so do individual Facebook users…Ultimately, people need to save themselves more and worry a little less about saving everyone else.”

A number of prominent figures have called for some sort of regulation of Facebook — including one of the company’s co-founders and a venture capitalist who was one of Facebook’s early backers. Much of the criticism of Facebook relates to how the company’s algorithms target users with advertising, and the “echo chambers” that can show users ideologically slanted content.

Despite the public criticism, the company has continued to post record profits. And billions of people — including more than two-thirds of American adults — continue to use the unregulated version of Facebook that exists now.

I have been studying the social dynamics of the internet for 30 years, and I suspect that what’s behind these apparent contradictions is something psychological. People know about Facebook’s problems, but each person assumes he or she is largely immune — even while imagining that everyone else is very susceptible to influence. That paradox helps explain why more than 2 billion people continue to use the site each month. And it also helps explain what’s behind the pressure to regulate.

It’s not me, it’s them

The psychological tendency at work here is called the third-person effect: the belief that a form of media doesn’t fool me, and maybe doesn’t fool you, but all those other people are sitting ducks for media effects.

Ironically, this dynamic can encourage people to support restrictions on media consumption — by others. If someone uses, say, a social media site and feels immune to its negative influences, it triggers another psychological phenomenon called the influence of presumed influence. When that happens, a person worries that everyone else is falling victim and supports efforts to protect them — even if they think they don’t need the protection themselves.

This could be why there are lots of Facebook users who complain about Facebook’s danger to others, but continue using it nevertheless. Even the Facebook-funding venture capitalist Roger McNamee, who wrote a book about how bad Facebook has become, may have fallen prey to this psychological irony. As The Washington Post reports, “despite…his disgust with the worst crimes of social media platforms…McNamee not only still owns Facebook shares…he also still counts himself among the behemoth’s more than 2 billion users. After all, McNamee acknowledges with a shrug and a smile, ‘I’ve got a book to promote.'”

Not everyone can be above average

McNamee may think he’s immune to the echo chambers and other online influences that, he warns, affect the average Facebook user. What if average Facebook users think they’re not the average Facebook user, and believe that they’re immune to the platform’s pernicious influences?

I explored this possibility in a survey of 515 U.S. adults who had used Facebook at least once in the previous week. Participants were recruited by Qualtrics, the company that administered my survey questions. Respondents lived in all 50 states, their average age was 39, and they reported spending just under 10 hours per week on Facebook, a figure they estimated to be about the same as that of most other Facebook users.

The survey asked respondents three groups of questions. One was about how strongly they believe that Facebook affects them on a number of important social and political topics, including building a wall on the U.S.-Mexico border, expanding or repealing the Affordable Care Act, whether President Trump is doing a good job, and other major national issues.

The second group of questions asked how much each respondent believes Facebook affects others’ perceptions of those same issues — how much social media affects their idea of “the average person.” The third group of questions asked how strongly each respondent supported regulating Facebook, through a variety of possible strategies that include rulings from the Federal Trade Commission or the Federal Communications Commission, breaking up Facebook using anti-trust laws, requiring Facebook to reveal its algorithms, and other steps.

Eager to protect others

Respondents believed that Facebook affects other people’s perceptions much more strongly than it affects their own. The more they thought that others were more vulnerable than they were, the more they wanted to rein Facebook in.

People who thought they were far less affected than others, and who wanted Facebook regulated, also believed more strongly that the problem stems from the power of echo chambers to repeat, amplify, and reinforce a user's beliefs. That was true even though they, too, would be subject to any such regulations.

Echo chambers do exist, and they do affect people’s perceptions — even leading one person to shoot up a pizza parlor alleged to be a front for child prostitution. But research has called into question the idea that echo chambers are extremely influential over most people’s views.

In my view, it’s more important to help people understand that they are just as much at risk from Facebook as everyone else, whatever the level of risk may actually be. Society may bear some responsibility, but so do individual Facebook users. Otherwise, they’ll ignore recommendations about their own media consumption while supporting calls for sweeping regulations that may be too broad and potentially misdirected. Ultimately, people need to save themselves more and worry a little less about saving everyone else.

Joseph B. Walther is the Mark and Susan Bertelsen Presidential Chair in Technology and Society, the director of the Center for Information Technology and Society, and a distinguished professor of communication at UC Santa Barbara. This article is republished from The Conversation under a Creative Commons license.

Illustration by Amy Williams used under a Creative Commons license.
