Ed. note: Here at Nieman Lab, we’re long-time fans of the work being done at First Draft, which is working to protect communities around the world from harmful information (sign up for its daily and weekly briefings). We’re happy to share some First Draft stories with Nieman Lab readers.
Since 2016, the “field of misinformation” has been disproportionately focused on political disinformation, with an emphasis on Facebook and Twitter. Globally, the larger threat has been health and science misinformation across a range of platforms.
But the field’s focus was not determined by the communities most affected by misinformation, nor by the relative harm of different types of misinformation. Instead, it was set by U.S.-based university researchers, media outlets, philanthropic institutions, and Silicon Valley-based platforms, whose obsession with election-related disinformation directed the focus of misinformation initiatives, interventions, and research projects over the past four years.
This prioritization by news and research organizations left the U.S. and other countries ill-equipped for the pandemic: health authorities had to play catch-up on the challenges of misinformation, interventions focused disproportionately on slowing misleading political speech, and journalists were unprepared to report on scientific research. To prepare for growing distrust in science and expertise, alongside the flood of actual misinformation we expect to see in 2021, researchers, technologists, journalists, philanthropists, and policymakers must refocus their attention on health and science communication, most notably around medicine and climate.
Three recommendations should guide that shift. The first is the need to educate journalists about science and research so they are able to adequately question press releases from academics, researchers, and pharmaceutical companies.
The second is a need to educate science and health professionals about the current information ecosystem. In this fragmented, networked world, the caution and discipline that define scientific discovery are being weaponized by bad actors.
The third is the critical need to raise awareness of the harm done to communities of color globally, and of how that harm has created a deep distrust of medical professionals. The focus on misinformation should not obscure the urgent need to understand these dynamics.
For 2021, journalists and communication professionals in the field of misinformation should include the necessary disclaimers when reporting on non-peer-reviewed research, and more often consider whether reporting on such early-stage research benefits the public at all. Similarly, platforms should train fact checkers and internal content moderation teams in how to respond to health and science information. Many of the fact checkers in Facebook’s Fact-Checking Project are excellent at debunking political claims or viral misinformation. Few, however, have deep health and science expertise in-house, even as they are increasingly asked to work in these fields.
The current information ecosystem is no longer structured in a linear fashion, dominated by gatekeepers using broadcast techniques to inform. Instead it is a fragmented network, where members of different communities use their own content distribution strategies and techniques for interacting and keeping one another informed.
Scientists and health communication professionals have been in the spotlight this year, and we have to learn from the mistakes that have been made. These include the real-world impact of equivocation about the efficacy of masks and the dangers of airborne transmission. There is also a need to reflect on how different language choices affect different communities. We need to recognize the ways in which the complexity and nuance of scientific discovery lead to confusion, and often inspire people to seek answers on the internet, leaving them vulnerable to conspiracy theories that offer simple, powerful explanations. And we need to communicate simply and increasingly visually, rather than via long blocks of text and PDFs.
Explaining methodology and experimental limitations will not address institutional trust concerns ingrained in Black communities. Starting in the 1930s and concluding in 1972, the United States Public Health Service collaborated with the Tuskegee Institute, a historically Black college, to study syphilis in Black men. Those who participated were never informed of their diagnosis and did not receive the free healthcare they were promised. Additionally, doctors declined to treat the participants with penicillin, despite knowing it could cure the disease.
Of the original 399 participants, 28 died of syphilis, 100 died of related complications, 40 of their wives became infected, and 19 of their children were born with congenital syphilis, creating generational harm. These concerns have spread to online spaces, where users fear that Black people will be used as “guinea pigs” when Covid-19 vaccinations arrive.
Earlier this year, concerns were raised again over allegations of forced sterilization and hysterectomies performed on undocumented women at a for-profit Immigration and Customs Enforcement detention center, building on a long history of unwanted medical testing and eugenics programs. Injustices such as these can deepen distrust of both the government and the medical system in Black, Latinx, and Indigenous communities. Several science communication initiatives focused on Covid-19 misinformation in 2020, but health professionals must begin 2021 by acknowledging, appreciating, and discussing this mistrust.
In 2018, the Ebola epidemic spread both disease and disinformation in the Democratic Republic of Congo. On social media platforms such as Facebook and WhatsApp, citizens blamed foreigners and Western doctors for the spread of the virus. Many pushed back against safety precautions, with rumors leading to attacks on hospitals and health care workers.
Research on the earlier epidemic in West Africa showed that efforts by the World Health Organization, the Red Cross, and other global organizations to curb Ebola misinformation were ineffective when they failed to take into account “historical, political, economic, and social contexts.” Communication around health protocols such as hand washing did not result in behavioral change because people did not view the practice as a priority. Instead, they turned to trusted local sources, such as religious leaders, for direction. The same dynamic played out in the United States during the first wave of Covid-19 in the spring, when some pastors preached conspiracy theories to their congregations.
Researchers, journalists, and policymakers must take into account cultural and religious traditions, apprehension toward the medical industry and government, and the role trusted local leaders play when building effective science communication strategies.
Back in March, as the social platforms took what looked like decisive action to tackle misinformation, partnering with the WHO, creating information hubs and cracking down on Covid-19-related conspiracies, many observers applauded. But it was clear that the platforms felt a sudden freedom to act around health and science misinformation. The WHO could be the arbiter of truth, unlike fact checkers trying their best to referee political speech. Health misinformation felt like an easier challenge to solve.
By April, the growth of “excessive quarantine” and anti-lockdown communities online demonstrated the naïveté of these assumptions. Health misinformation cannot be disentangled from political speech.
2020 has taught us that we should focus on the tactics, techniques, and characteristics of rumors, falsehoods, and conspiracy theories. The same tactics researchers documented around elections emerged this year with a vengeance in the context of health and science. So in 2021, let’s learn the lessons collectively, rather than leaving political psychologists to decide whether misinformation can sway elections while, separately, infodemic managers at public health bodies decide whether memes influence mask wearing.
Understanding how established misleading narratives and strategies can be modified and repurposed to drive politicized agendas can help clarify and focus research, language and sourcing around medicine and climate communication in the new year.
Diara J. Townes is an investigative researcher and the community engagement lead for First Draft’s U.S. bureau. Claire Wardle leads strategy at First Draft.