Nieman Foundation at Harvard
March 22, 2019, 8 a.m.

After New Zealand, is it time for Facebook Live to be shut down?

“It’s past time for the company to step up and fulfill the promise founder and CEO Mark Zuckerberg made two years ago: ‘We will keep doing all we can to prevent tragedies like this from happening.'”

When word broke that the massacre in New Zealand was livestreamed on Facebook, I immediately thought of Robert Godwin Sr. In 2017, Godwin was murdered in Cleveland, and initial reports indicated that the attacker had streamed it on Facebook Live — at the time, a relatively new feature of the social network. Facebook later clarified that the graphic video was uploaded after the event, but the incident called public attention to the risks of livestreaming violence.

In the wake of Godwin’s murder, I recommended that Facebook Live broadcasts be time-delayed, at least for Facebook users who had told the company they were under 18. That way, adult users would have an opportunity to flag inappropriate content before children were exposed to it. Facebook Live has broadcast killings, as well as other serious crimes such as sexual assault, torture, and child abuse. Though Facebook has hired more than 3,000 additional human content moderators, it has not gotten any better at keeping horrifying violence from streaming live without any filter or warning for users.

In the 24 hours after the New Zealand massacre, 1.5 million videos and images of the killings were uploaded to Facebook’s servers, the company said. Facebook highlighted the fact that 1.2 million of them “were blocked at upload.” However, as a social media researcher and educator, I heard that as an admission that 300,000 videos and images of a mass murder passed through its automated systems and were visible on the platform.

The company noted that fewer than 200 people viewed the livestream of the massacre and that, surprisingly, no users reported it to Facebook until after it ended. These details make painfully clear how dependent Facebook is on users to flag harmful content. They also suggest that people don’t know how to report inappropriate content — or don’t have confidence the company will act on the complaint.

The video that remained after the livestream ended was viewed nearly 4,000 times — which doesn’t include the many copies of the video uploaded to other sites and to Facebook by other users. It’s unclear how many of the people who saw it were minors; children as young as 13 are allowed to set up Facebook accounts and could have encountered unfiltered footage of murderous hatred. It’s past time for the company to step up and fulfill the promise founder and CEO Mark Zuckerberg made two years ago, after Godwin’s murder: “We will keep doing all we can to prevent tragedies like this from happening.”

A simple time delay

In the television industry, short time delays of a few seconds are typical during broadcasts of live events. That time allows a moderator to review the content and confirm that it’s appropriate for a broad audience.

Facebook relies on users as moderators, and most livestreams don’t have as large an audience as TV, so its delay would need to be longer — perhaps a few minutes. Only then would enough adult users have screened it and had the opportunity to report its content. Major users, including publishers and corporations, could be permitted to livestream directly after completing a training course. Facebook could even let people request a company moderator for upcoming livestreams.

Facebook has not yet taken this relatively simple step, and the reason is clear. Time delays took hold in TV only because broadcasting regulators penalized broadcasters for airing inappropriate content during live shows. There is effectively no regulation for social media companies; they change only in pursuit of profits or to minimize public outcry.

Whether and how to regulate social media is a political question, but many U.S. politicians have developed deep ties with platforms like Facebook. Some have relied on social media to collect donations, target supporters with advertising and help them get elected. Once in office, they continue to use social media to communicate with supporters in hopes of getting reelected.

Federal agencies also use social media to communicate with the public and influence people’s opinions — even in violation of U.S. law. In my view, Facebook’s role as a tool to gain, keep, and spread political power makes politicians far less likely to rein it in.

Congress has not yet taken any meaningful action to regulate social media companies. Despite strong statements from politicians and even calls for hearings about social media in response to the New Zealand attack, U.S. regulators aren’t likely to lead the way.

European Union officials are handling much of the work, especially around privacy. New Zealand’s government has stepped up, too, banning the livestream video of the mosque massacre, meaning anyone who shares it could face up to NZ$10,000 in fines and 14 years in prison. At least two people have already been arrested for sharing it online.

Facebook could — and should — act now

Much of the discussion about regulating social media has considered using antitrust and monopoly laws to force technology giants like Facebook to break up into smaller, separate companies. But if that happens at all, it will take a long time: breaking up AT&T took a decade, from the 1974 lawsuit to the 1984 launch of the “Baby Bell” companies.

In the interim, there will be many more dangerous and violent incidents that people will try to livestream. Facebook should evaluate its products’ potential for misuse and discontinue them if the effects are harmful to society.

No child should ever see the sort of “raw and visceral content” that has been produced on Facebook Live — including mass murder. I don’t think adult users should have to witness such heinous acts either, as studies have shown that viewing graphic violence carries health risks, including post-traumatic stress.

That’s why I’m no longer recommending just a livestream delay for adolescent users; that was an appeal to protect children at a time when more significant platform changes seemed unlikely. But everyone deserves better, safer social media. I’m now calling on Mark Zuckerberg to shut down Facebook Live in the interest of public health and safety. In my view, the feature should be restored only if the company can prove to the public — and to regulators — that its design is safer.

Handling livestreaming safely includes having more than enough professional content moderators to handle the workload. Those workers also must have appropriate access to mental health support and safe working environments, so that even Facebook employees and contractors are not unduly scarred by brutal violence posted online.

Jennifer Grygiel is an assistant professor of communications at the S.I. Newhouse School of Public Communications at Syracuse University. A version of this story appears at The Conversation.

Photo of a livestreamer across the street from where a police officer shot and killed Philando Castile in Falcon Heights, Minnesota, July 6, 2016, by Tony Webster used under a Creative Commons license.
