Nieman Foundation at Harvard
Aug. 15, 2018, 7:38 a.m.
Aggregation & Discovery

Major internet companies might want to push their own point of view, but can they also take care of misinformation, please and thank you

Three-quarters of Americans surveyed say social networks should show the same set of news topics to all users, ignoring their stated interests or browsing history. (Someone should tell them about newspapers!)

We've all heard Facebook's view on the role that major companies play in deciding who gets what news. (Really, no need to say it twice.)

But what does your average Mark or Campbell think?

According to a new survey by the Knight Foundation and Gallup, American adults feel negatively about major internet companies tailoring information to them individually, acting as content arbiters in ways that enhance bias, and failing to be transparent about their methods. (Note: Knight has provided support to Nieman Lab in the past.) The major internet companies in this context are Google, Yahoo, Facebook, and Twitter (surprise).

Of the 1,203 U.S. adults interviewed earlier this summer, most got their news from Google (53 percent daily or a few times a week) or Facebook (51 percent), while only 23 percent used Yahoo and 19 percent Twitter. The survey's authors kindly broke out the percentages we'll highlight here by familiarity with algorithms, generation, and education level.

54 percent said internet content curated by the companies' algorithms is a bad idea (including 16 percent who call it a "very bad" idea), while 45 percent think it's a good one. But that split falls mostly along generational lines, with the young folk in favor and the Boomers against.

~ Waves ~ have been made lately in the platforms-with-political-biases realm, thanks to another memorable move by a platform leader: Twitter chief Jack Dorsey's interview with Fox News frontman Sean Hannity. That was seen as a placation of conservatives outraged at their "censorship," which has snowballed into Alex Jones calling Dorsey a good "patriot."

It's notable that 71 percent of the adults surveyed thought promoting a company's preferred political agenda was a reason behind how these companies surface content, with 43 percent of those believing it was a major reason. And 49 percent think the companies don't do it to "help support a more informed society by connecting people to important news" (their words, not mine). Those numbers break down along party lines, with 66 percent of Republicans in the "major reason" camp, 42 percent of independents, and only 29 percent of Democrats. (73 percent of those interviewed think the companies should show all people the same topics, disregarding their interests or past browsing/search history.)

Only 15 percent of respondents feel that the major internet companies are doing enough to stem the spread of misinformation, with 48 percent strongly feeling they aren’t. Through the political party lens, 19 percent of Democrats, 15 percent of independents, and 10 percent of Republicans feel the companies are doing enough against misinformation.

A whopping 80 percent of those surveyed favored excluding a news item from a news feed or results page if it contains misinformation, but they're not necessarily asking for individually flagged posts to solve the problem (65 percent are against excluding a news item just because users have complained about it). They say these opinions are based on concerns about giving people a biased picture of the news, elevating the company's preferred points of view, suppressing certain points of view, and increasing the influence of certain news organizations over their competitors. Republicans feel most strongly about all of this.

The report pointed out that a majority of respondents do not think that displaying certain news items amounts to an endorsement by those companies of the information's accuracy; still, 43 percent do think so.

And 60 percent said they're usually aware of the news organization behind a link surfaced on Google/Facebook/Yahoo/Twitter, with younger and/or college-educated respondents tending to be more aware than their counterparts.

These charts speak for themselves:

But there's also an interesting gender breakdown in the responses: women tend to think the companies themselves should be held accountable for ensuring Americans get an accurate and unbiased view of the news, whereas men feel that's the users' responsibility.

Illustration via Vecteezy.
