Nieman Foundation at Harvard
April 15, 2019, 3:13 p.m.

As Notre Dame burned, an algorithmic error at YouTube put information about 9/11 under news videos

A reminder that even efforts to limit misinformation can end up spreading it instead — and that human editors watching over the algorithms can be a pretty good thing, too.

It’s terrible news for anyone who values history, loves Paris, or read Victor Hugo: Notre-Dame Cathedral, the Gothic gem at the historic center of Paris, is on fire. It’s obviously far too early for anything conclusive, but early suggestions from officials are that the blaze could be related to the ongoing renovations to the roof. There’s no indication at this writing that it’s a terror attack or related in any way to a terrorist group.

So as people turned to YouTube to see live streams from trusted news organizations of the fire in progress, why was YouTube showing them background information about 9/11?

I first noticed this when I went to France24, which produces an English-language feed, and saw the unusual box.

Looking around, along with France24, I also saw the 9/11 info on the streams of CBS News and NBC News.

Why is YouTube adding information to these videos that seems tailor-made to make people think it’s a terror attack? I asked Google and got this statement from a spokesperson:

We are deeply saddened by the ongoing fire at the Notre Dame cathedral. Last year, we launched information panels with links to third party sources like Encyclopedia Britannica and Wikipedia for subjects subject to misinformation. These panels are triggered algorithmically and our systems sometimes make the wrong call. We are disabling these panels for live streams related to the fire.

You may remember those information panels from when they were announced at SXSW last year, with CEO Susan Wojcicki saying that:

…when there are videos that are focused around something that’s a conspiracy — and we’re using a list of well-known internet conspiracies from Wikipedia — then we will show a companion unit of information from Wikipedia showing that here is information about the event.

That well-intentioned effort faced criticism on a couple of fronts: Google’s YouTube would be freeloading on the backs of unpaid Wikipedia editors, and those info boxes (with a link to Wikipedia) risked infecting that comparatively conspiracy-resistant platform with a bunch of YouTube crazies.

YouTube has expanded that effort in a few ways over time, including showing the boxes when someone searches for conspiracy-friendly terms (even if they don’t click through to a video) and using similar methods to denote news organizations that receive government funding.

It’s unclear why a breaking news event — one about which there hasn’t been time for any substantial conspiracies to take root — got the information panel, much less a 9/11 one; Google fixed the problem less than an hour after I noticed it. But it’s a reminder that even efforts to limit misinformation can end up spreading it instead — and that human editors watching over the algorithms can be a pretty good thing, too.

UPDATE, 5:40 p.m.: A few quick follow-ups, since this story has now been picked up at other sites. First, here’s a previous example of the 9/11 infobox being added to an unrelated video; KCRW’s Mario Cotto noted that some old footage of New York City from 1976 got tagged with it:

The title and description of that video don’t mention anything more 9/11-related than “New York” — no mention of the World Trade Center, for instance. (The infobox has since been removed.)

Then there’s this from CUNY’s Luke Waltzer: a video of his father Ken’s retirement from Michigan State, which somehow got labeled with a “Jew” infobox:

Waltzer used to head the Jewish Studies program at Michigan State, but, again, nothing in the title or description mentions anything Jewish.

Google’s official description of the infobox program says that it places the boxes “alongside videos on a small number of well-established historical and scientific topics that have often been subject to misinformation online, like the moon landing…This information panel will appear alongside videos related to the topic, regardless of the opinions or perspectives expressed in the videos.” I guess the algorithm it’s using considered a video about a Jewish man retiring to be sufficiently about the topic of “Jew” to merit the box, just as it considered random 40-year-old footage of New York to be “related” to 9/11.

In other words, it isn’t just that the algorithm sometimes completely misses the boat, like confusing Notre-Dame and the World Trade Center. Even when it’s not making a big categorization error, it can still be putting up very inappropriate “information.”

Some other examples: A video of a launch of the Falcon Heavy rocket got labeled with 9/11 — presumably because it showed two towers and a lot of smoke?

9/11 also got attached to a video stream promising “College Music · 24/7 Live Radio · Study Music · Chill Music · Calming Music”:

Same for a video of a random fire in San Francisco:

A couple other thoughts: Mike Caulfield rightly notes that simply linking to accurate information isn’t the best way to battle a conspiracy theory.

Bassey Etim says that while human monitoring of every topic on YouTube is obviously impractical, there’s no reason it couldn’t use humans on a first pass for this sort of stuff on the most important stories — especially the big breaking ones.

(Etim used to lead content moderation at The New York Times, so he knows the value of giving humans oversight over a small subset of the most important information judgment calls, while letting algorithms handle the rest.)

Joshua Benton is the senior writer and former director of Nieman Lab. You can reach him via email (joshua_benton@harvard.edu) or Twitter DM (@jbenton).