Nieman Foundation at Harvard
Aug. 15, 2018, noon
Reporting & Production

Democracy is cracking and platforms are no help. What can we do about it? Some policy suggestions

Here are a few in a new Canadian report: greater transparency requirements for digital news publishers, holding social media companies legally liable for the content on their platforms, and mandatory independent audits for platform algorithms.

Platforms aren’t effectively self-regulating. Government officials don’t know how Facebook’s advertising works (or some know it too well). The internet can be a cesspool of spiteful users and malicious bots — and yeah, in some places, digital-based communities and positive connections. But what can be done?

How about making internet companies legally liable for the content appearing on their platforms?

Auditing algorithms regularly and making the results publicly available?

Launching a large-scale civic literacy and critical thinking campaign?

Giving individuals greater rights over the use, mobility, and monetization of their data?

These are some of the suggestions floated in “Democracy Divided,” a new Canadian report by Public Policy Forum CEO/former Globe and Mail journalist Edward Greenspon and University of British Columbia assistant professor/Columbia Journalism School senior fellow Taylor Owen. The ideas are bold, sure, and maybe a little far-fetched — especially when viewed from the very different regulatory context of the United States — but hey, bold thinking is at least somewhere to start.

“We believe that certain behaviors need to be remedied; that digital attacks on democracy can no more be tolerated than physical ones; that one raises the likelihood of the other in any case; and that a lowering of standards simply serves to grant permission to those intent on doing harm,” they wrote.

Greenspon also authored a report last year about how Canada could strengthen its struggling news ecosystem, with 12 specific steps.

In the new report, Greenspon and Owen start with assumptions like “there is a necessary role for policy; self-regulation is insufficient on its own” and “elected representatives have a responsibility to ensure the public sphere does not become polluted with disinformation and hate by setting rules, not by serving as regulators.”

(Side note: In a survey with results out today — though of users in Canada’s southern neighbor — respondents narrowly preferred that companies be held accountable for accurate and unbiased information rather than have the government get involved. A third, though, felt that users themselves should bear the responsibility.)

The recommendations also push for more transparency and accountability from the platforms and companies that host the vast majority of public dialogue today.

“The internet represents the greatest advance in communications since the printing press, but its consolidation by a handful of giant global companies and the exploitation of its vulnerabilities by individuals and organizations intent on destabilizing our democracy have reversed its early promise and challenged the public interest,” Greenspon and Owen wrote. Read the full report here.
