Nieman Foundation at Harvard
Nov. 7, 2018, 10 a.m.
Audience & Social

YouTube helps a majority of American users understand current events — but 64 percent say they see untrue info

When a one-hour outage on the platform can result in a 20 percent net hike in traffic to publishers’ websites, YouTube’s got a special share of the attention economy.

As much as we rag (mmm, rightfully!) on the major tech platforms for their algorithms getting “don’t amplify disinformation” wrong, YouTube as a platform occupies a very peculiar spot. Unlike its more social peers, YouTube isn’t primarily about making meaningful connections, snippets of snark, or perfected selfies. It’s closer to a pure consumption platform, at least the way most people use it, and it’s unusually directed toward usefulness.

Are you actually wasting time on YouTube when you’re watching a cooking video instead of scrolling/tapping mindlessly through one of your various News Feeds elsewhere? Is it pacifying your grabby infant so you can be an adult and clean the bathroom? Are you going to learn how to knit or repair something in your home any other way? See, useful.

YouTube, which recorded 1.5 billion monthly logged-in users last year, also has the downsides of drawing some users into more extreme-content rabbit holes, surfacing disturbing videos on the kid-friendly version of the platform, and amplifying creators like those Paul brothers who stupidly vlog from Japanese forests. Not so useful.

Still, when a one-hour outage on the platform can result in a 20 percent jump in traffic to publishers’ websites (compared to a 2.3 percent increase when Facebook was down), YouTube’s got a special share of the attention economy.

The Pew Research Center has new data on just how useful YouTube is — including its recommendations algorithm, which apparently drives 70 percent of consumption. 35 percent of all U.S. adults use YouTube, and 51 percent of those users say YouTube has helped them learn how to do something for the first time, according to a new report drawing on 4,500 Americans. The percentage of YouTube users who say they get news or headlines there has doubled since 2013 (38 percent today, compared to 20 percent then).

YouTube also plays a big role in occupying those who aren’t yet of reading age. 81 percent of all parents with kids age 11 and under have used YouTube to placate their spawn at least once; more than a third allow their kid to watch videos on the platform regularly. The Pew report points out that YouTube, by YouTube/Google’s own policies, is intended for those age 13 and older, though YouTube Kids is supposed to be a safer version of the platform.

There’s still plenty of questionable content on YouTube, and a majority of respondents said they often encounter “troubling or problematic” videos: 60 percent told Pew that they end up watching videos of “dangerous or troubling behavior,” and 64 percent see videos that “seem obviously false or untrue.” This persists in kids’ content as well: One example The New York Times highlighted was a three-year-old boy coming across “PAW Patrol Babies Pretend to Die Suicide by Annabelle Hypnotized.” This is pretty much the opposite of useful.

Crises like the PAW Patrol incident uncovered by the Times, not to mention a whipsawing 2017 for the platform — The Verge highlighted the downfall of its biggest star, the apparently anti-Semitic gamer PewDiePie, and a near-boycott from big brands whose advertising was running alongside racist videos — spurred YouTube to release a transparency report in May. Users have always had the opportunity to flag inappropriate content, as we wrote at the time, but it turns out YouTube didn’t rely too heavily on those signals:

YouTube’s latest transparency report tells us a great deal about how user flags now matter to its content moderation process — and it’s not much. Clearly, automated software designed to detect possible violations and “flag” them for review does the majority of the work. In the three-month period between October and December 2017, 8.2 million videos were removed; 80 percent of those removed were flagged by software, 13 percent by trusted flaggers, and only 4 percent by regular users. Strikingly, 75 percent of the videos removed were gone before they’d been viewed even once, which means they simply could not have been flagged by a user.

On the other hand, according to this data, YouTube received 9.3 million flags in the same three months, 94 percent from regular users. But those flags led to very few removals. In the report, YouTube is diplomatic about the value of these flags: “user flags are critical to identifying some violative content that needs to be removed, but users also flag lots of benign content, which is why trained reviewers and systems are critical to ensure we only act on videos that violate our policies.”

Pew researchers also explored the recommendation algorithm, which 81 percent of those polled say at least “occasionally” drives their video consumption choices. Here’s what they found:

  • 28 percent of the videos they encountered were recommended multiple times, “suggesting that the recommendation algorithm points viewers to a consistent set of videos with some regularity.”
  • YouTube recommends longer and longer content over time. The researchers started with videos that were 9:31 long, on average, and by the fourth recommendation were directed to a nearly 15-minute-long video.
  • The algorithm also pointed users toward more and more popular videos. More than two-thirds of the recommended videos had more than 1 million views, and the average view count climbed from 8 million for the starting videos to 30 million for the first recommendation and more than 40 million by the fourth.

Video has not proven effective as the next! hot! thing! for publishers to pivot to, as demonstrated by Facebook’s video hype-and-fail. But the YouTube niche is there, and it’s definitely not cold. Nearly one in five respondents told Pew YouTube helps them understand things happening in the world — you know, current events and news, to name a few.

Earlier this year, YouTube announced its plan for improving the platform’s news discovery experience. It includes $25 million in grants for news organizations to build out their video operations and experiments with boosting local news in YouTube’s connected TV app — not to mention adding text-based news article snippets from “authoritative sources” alongside search results in breaking situations — but TBD on that initiative’s success. If YouTube really wants to be the most useful platform, it might want to make sure it’s not scarring children for the rest of their lives or radicalizing someone who just wants to learn how to clean a gun.

Image from geralt used under a Creative Commons license.
