Nieman Foundation at Harvard
Aug. 15, 2018, noon
Reporting & Production

Democracy is cracking and platforms are no help. What can we do about it? Some policy suggestions

Here are a few in a new Canadian report: greater transparency requirements for digital news publishers, holding social media companies legally liable for the content on their platforms, and mandatory independent audits for platform algorithms.

Platforms aren’t efficiently self-regulating. Government officials don’t know how Facebook’s advertising works (or some know it too well). The internet can be a cesspool of spiteful users and malicious bots — and, yes, in some places, a home for digital communities and positive connections. But what can be done?

How about requiring internet companies to be legally liable for the content appearing in their domains?

Auditing algorithms regularly and making the results publicly available?

Launching a large-scale civic literacy and critical thinking campaign?

Giving individuals greater rights over the use, mobility, and monetization of their data?

These are some of the suggestions floated in “Democracy Divided,” a new Canadian report by Public Policy Forum CEO/former Globe and Mail journalist Edward Greenspon and University of British Columbia assistant professor/Columbia Journalism School senior fellow Taylor Owen. The ideas are bold, sure, and maybe a little far-fetched — especially when viewed from the very different regulatory context of the United States — but hey, bold thinking is at least somewhere to start.

“We believe that certain behaviors need to be remedied; that digital attacks on democracy can no more be tolerated than physical ones; that one raises the likelihood of the other in any case; and that a lowering of standards simply serves to grant permission to those intent on doing harm,” they wrote.

Greenspon also authored a report last year about how Canada could strengthen its struggling news ecosystem, with 12 specific steps.

In the new report, Greenspon and Owen start with assumptions like “there is a necessary role for policy; self-regulation is insufficient on its own” and “elected representatives have a responsibility to ensure the public sphere does not become polluted with disinformation and hate by setting rules, not by serving as regulators.”

(Side note: In a survey with results out today, though from Canada’s southern neighbor, internet users narrowly opted for companies to be held accountable for accurate and unbiased information rather than for the government to get involved. But a third felt that the users should be responsible instead.)

The recommendations also push for more transparency and accountability from the platforms and companies that host the vast majority of public dialogue today.

“The internet represents the greatest advance in communications since the printing press, but its consolidation by a handful of giant global companies and the exploitation of its vulnerabilities by individuals and organizations intent on destabilizing our democracy have reversed its early promise and challenged the public interest,” Greenspon and Owen wrote. Read the full report here.
