Nov. 15, 2021, 2:58 p.m.

How do you fix an “information disorder”? The Aspen Institute has some ideas

“Understanding the root problems of information disorder requires understanding hard-wired human behaviors, economic and political policy, group psychology and ideologies, and the relationship to people’s sense of individual and community identity.”

It is fall now, and out west, the aspens are turning. In this case, it’s the Aspen Institute’s Commission on Information Disorder, a group of smart, powerful, and/or ex-royal people tasked with figuring out how to tackle the misinformation seemingly endemic to modern digital life.

(Among the names that’ll be most familiar to Nieman Lab readers: Amanda Zamora, Alex Stamos, Kate Starbird, Jameel Jaffer, Safiya Umoja Noble, Deb Roy, Katie Couric, Henry C. A. D. Windsor, and Kathryn Murdoch.)[1]

Today, the commission released its final report, summing up its findings and making 15 recommendations for improvement — what it calls “key, measurable actions.”

Before we get to the commission’s recommendations, let’s look at its summary of the status quo, what the report calls “key insights and context,” with excerpts from each.

Disinformation is a symptom; the disease is complex structural inequities.

Mis- and disinformation do not exist in a vacuum. The spread of false and misleading narratives, the incitement of division and hate, and the erosion of trust have a long history, with corporate, state actor, and political persuasion techniques employed to maintain power and profit, create harm, and/or advance political or ideological goals. Malicious actors use cheap and accessible methods to deliberately spread and amplify harmful information…False narratives can sow division, hamper public health initiatives, undermine elections, or deliver fresh marks to grifters and profiteers, and they capitalize on deep-rooted problems within American society. Disinformation pours lighter fluid on the sparks of discord that exist in every community.

The absence of clear leadership is slowing responses.

Currently, the U.S. lacks any strategic approach and clear leadership in either the public or the private sector to address information disorder…Congress, meanwhile, remains woefully under-informed about the titanic changes transforming modern life and has under-invested in the staff, talent, and knowledge to understand and legislate these new realms, particularly given that it has never replaced the capability lost by the closure of the Office of Technology Assessment in the 1990s. The technology industry lobby also has outsized influence in shaping legislative priorities favorable to its interests…More than any single action or implementable recommendation we could make, it is necessary for our government, civil society, and private sector leaders to prioritize, commit to, and follow through on addressing the worst harms and worst actors and to invest in their own capacities to understand and respond to the problems we face together.

Trade-offs between speech and misinformation are not easy.

Should platforms be responsible for user-generated content? If so, under what circumstances? What exactly would responsibility look like? These questions are deeply contested within legal and policy debates…Despite pleas by “Big Tech” to be regulated around speech issues, it has repeatedly sought to push the task back to lawmakers to devise rules and regulations that will respect free speech while protecting consumers from harm, often under the frameworks most favorable to the industry. This strategy also ensures that tech companies can continue to exploit the lack of such constraints to their benefit until new regulations are in place.

Disinfo doesn’t just deceive; it provides permission: supply meets the demand.

Understanding the root problems of information disorder requires understanding hard-wired human behaviors, economic and political policy, group psychology and ideologies, and the relationship to people’s sense of individual and community identity…One of the most challenging aspects of addressing information disorder is confronting the reality that “disinformation” and information campaigns by bad actors don’t magically create bigotry, misogyny, racism, or intolerance — instead, such efforts are often about giving readers and consumers permission to believe things they were already predisposed to believe. There is a “demand” for disinformation (amplified and driven by product designs, to be sure), but reckoning with our problems online will require taking a hard look at our society offline.

The platforms’ lack of transparency is hampering solutions.

Over a century ago, Justice Louis Brandeis promised that “sunlight is said to be the best of disinfectants,” and online today, it’s clear we have far too little. Understanding both the behaviors of users, platforms, and algorithms and the resulting impacts of information disorder requires much more data. Critical research on disinformation — whether it be the efficacy of digital ads or the various online content moderation policies — is undercut by a lack of access to data and processes. This includes information regarding what messages are shared at scale and by whom, whether they are paid, and how they are targeted.

Online incentives drive ad revenue, not better public discourse.

[Targeted programmatic advertising] has proven fantastically profitable, and tech companies like Google and Facebook sit at the top of Wall Street markets, richly rewarded for their ability to translate consumer attention into dollars…Ads are not just about selling toothpaste or better mousetraps either; platform tools have made it possible to amplify content to narrow segments of the population, often for political purposes. Advertising tools provided by platforms can include or exclude specific users, creating a powerful, unaccountable, and often untraceable method of targeting misinformation.

Broken norms allow bad actors to flourish.

One of the most difficult areas to address in an American context is today’s shifting norms around falsehoods and misrepresentation of facts among prominent public figures. Politicians, CEOs, news anchors, talk radio hosts, and professionals can abuse their prominent roles and high degrees of reach for both personal and partisan gain. This trend is exacerbated by a political and business environment that offers fewer and fewer consequences for these actions. In the past, in the public and business sphere at least, leaders have had to contend with the risk that they would be punished and distrusted by voters or consumers if caught in a lie. Today, though, they’re increasingly celebrated for their lies and mistruths — and punished, politically, for not ascribing to others’ falsehoods.

Local media has withered, while cable and digital are unaccountable.

A free and democratic society requires access to robust, independent, and trustworthy media institutions. The distrust we see today, which fluctuates across types of media and different groups, has been decades in the making, for varied, well-documented reasons: from the decline of quality reporting in the face of the collapse of traditional economic models, to the rise of partisan or bad faith publishers at the national and local level, to the failures of reporting in the lead-up to war, to a lack of diversity in newsrooms that may result in misrepresentation of the experiences of Black and other minority communities.

All reasonable enough; personally, I think No. 1 and No. 4 have usually been underestimated in public discussion of this misinfo moment. But a commission like this can’t just summarize — there must be recommendations! Aspen has 15, broken into three overarching categories: recommendations to increase transparency, to build trust, and to reduce harms.

To increase transparency

Public interest research: “Implement protections for researchers and journalists who violate platform terms of service by responsibly conducting research on public data of civic interest.” And: “Require platforms to disclose certain categories of private data to qualified academic researchers, so long as that research respects user privacy, does not endanger platform integrity, and remains in the public interest.”

High reach content disclosure: “Create a legal requirement for all social media platforms to regularly publish the content, source accounts, reach and impression data for posts that they organically deliver to large audiences.”

Content moderation platform disclosure: “Require social media platforms to disclose information about their content moderation policies and practices, and produce a time-limited archive of moderated content in a standardized format, available to authorized researchers.”

Ad transparency: “Require social media companies to regularly disclose, in a standardized format, key information about every digital ad and paid post that runs on their platforms.”

To build trust

Truth and transformation: “Endorse efforts that focus on exposing how historical and current imbalances of power, access, and equity are manufactured and propagated further with mis- and disinformation — and on promoting community-led solutions to forging social bonds.”

Healthy digital discourse: “Develop and scale communication tools, networks, and platforms that are designed to bridge divides, build empathy, and strengthen trust among communities.”

Workforce diversity: “Increase investment and transparency to further diversity at social media platform companies and news media as a means to mitigate misinformation arising from uninformed and disconnected centers of power.”

Local media investment: “Promote substantial, long-term investment in local journalism that informs and empowers citizens, especially in underserved and marginalized communities.”

Accountability norms: “Promote new norms that create personal and professional consequences within communities and networks for individuals who willfully violate the public trust and use their privilege to harm the public.”

Election information security: “Improve U.S. election security and restore voter confidence with improved education, transparency, and resiliency.”

To reduce harms

Comprehensive federal approach: “Establish a comprehensive strategic approach to countering disinformation and the spread of misinformation, including a centralized national response strategy, clearly defined roles and responsibilities across the Executive Branch, and identified gaps in authorities and capabilities.”

Public Restoration Fund: “Create an independent organization, with a mandate to develop systemic misinformation countermeasures through education, research, and investment in local institutions.”

Civic empowerment: “Invest and innovate in online education and platform product features to increase users’ awareness of and resilience to online misinformation.”

Superspreader accountability: “Hold superspreaders of mis- and disinformation to account with clear, transparent, and consistently applied policies that enable quicker, more decisive actions and penalties, commensurate with their impacts, regardless of location, political views, or role in society.”

Amendments to Section 230 of the Communications Decency Act of 1996: “Withdraw platform immunity for content that is promoted through paid advertising and post promotion.” And: “Remove immunity as it relates to the implementation of product features, recommendation engines, and design.”

A few quick thoughts on what I think is overall a solid report, one that had me nodding a lot more often than shaking my head:

  • The transparency requirements recommended — “CrowdTangle, but mandated by the state” — would certainly be helpful to researchers and journalists. But I do wonder about the justification for applying them to social media platforms but not…everyone else who publishes content online. (“Platforms will be required by law to report out on content that produces a high degree of reach and engagement on their services.”) If my social startup Brztlskp has to report on what content its users are finding and consuming, why wouldn’t online publishers — your local daily, Fox News, ESPN, Newsmax, whoever — be subject to the same?

    Obviously, your local daily doesn’t have the reach of Facebook. But I wonder where the legal line gets drawn between “publishers” and “the large tech platforms we seem to be trying to legally rebrand as publishers.” It’s one thing to mandate transparency around moderated content — where at least there is a corporate action to be evaluated — but it’s another to do it around merely popular content.

  • The trust recommendations seem a little…airy. Color me skeptical about “new and emerging initiatives like Pol.is, Local Voices Network, and the Front Porch Forum” that “demonstrate the possibility of a new class of platforms in which purposeful design combined with intentional adoption by communities of users can provide communication spaces that are well suited to civic dialogue and understanding.” Putting 20 Trump voters and 20 Clinton voters in the same room for a catered and IRB-approved lunch doesn’t scale, and it doesn’t attract the participants who would need it most. (Maybe you should color me cynical instead.)

    Improving workforce diversity is a great idea! Just as it has been for the last 284 commissions to recommend it. Of all these issues, mistrust is the one I suspect is least vulnerable to commission-driven external action, alas.

    (You know what else is a great idea? Improving election security. But making that a key recommendation here implies that it’s somehow weak election systems that are responsible for the boom in election-fraud misinformation, and that “improving election security” would thus reduce the cries of electoral theft. It’s not as if the machines suddenly became vulnerable to bamboo-based fraud the minute Donald Trump decided to run for office. I suspect a big government effort to “improve election security” would, if anything, further harden the convictions of Republicans who believe recent elections were really “stolen” from them. It’d be like announcing a big government program to “improve microchip scarcity in Covid-19 vaccines.”)

  • The reducing-harms recs seem to have a similar issue — Big Projects Endorsed by Aspen Grandees that target people who mistrust and borderline loathe Big Projects Endorsed by Aspen Grandees. I sincerely doubt a Biden Administration “centralized national response strategy to effectively counter mis- and disinformation” would make people find “elites” more trustworthy.

    That said, I do really like one element of the “superspreader accountability” item: Platforms should be incentivized to spend more resources focused on big, popular accounts with massive reach, rather than the status quo of often giving those big, popular accounts extra undeserved leeway. And the recommended changes to Section 230 are more sensible than what either Democrats or Republicans in Congress have talked about.

It’s a good report, worth reading. If I seem less than wowed by some of the recommendations — as I seem to have an unappealing habit of being — it’s honestly not because I have a list of 62 better ideas on a legal pad somewhere. I…don’t!

I just think the “information disorder” is both (a) a very real issue that naturally attracts the attention of Big Commissions and Big Think Tanks and Big Reports, and (b) a problem that is uniquely immune to Big Commissions and Big Think Tanks and Big Reports.

This report nails it when it notes that:

Mis- and disinformation are not the root causes of society’s ills but, rather, expose society’s failures to overcome systemic problems, such as income inequality, racism, and corruption, which can be exploited to promote false information online.

The Internet is an amplifier. It increases both the reach and awareness of society’s ills. As long as the root causes exist — and as long as there are people who seek power, wealth, or fame through exploiting them — things will keep getting louder.

  1. Dig deeper into the report and you’ll also see names like Jay Rosen, danah boyd, Joan Donovan, Julia Angwin, Brendan Nyhan, Ethan Zuckerman, Margaret Sullivan, Lauren Williams, Steve Waldman, and Farai Chideya. And the head of Aspen Digital, from whence the commission was born, is Vivian Schiller.
Joshua Benton is the senior writer and former director of Nieman Lab. You can reach him via email (joshua_benton@harvard.edu) or Twitter DM (@jbenton).