Nieman Foundation at Harvard
May 27, 2021, 11:55 a.m.
Audience & Social

Nearly 40% of Americans already believe Covid-19 leaked from a lab. What happens if they turn out to be right?

Plus: “Media news consumption is by far the strongest independent predictor of QAnon beliefs.”

The growing stream of reporting on and data about fake news, misinformation, partisan content, and news literacy is hard to keep up with. This weekly roundup offers the highlights of what you might have missed.

What happens when it turns out that a “conspiracy theory,” once largely derided by mainstream media sources and fact-checkers, might actually be true?

As Glenn Kessler wrote in The Washington Post this week: “The source of the coronavirus that has left more than 3 million people dead around the world remains a mystery. But in recent months the idea that it emerged from the Wuhan Institute of Virology (WIV) — once dismissed as a ridiculous conspiracy theory — has gained new credence.” On Wednesday, President Biden directed U.S. intelligence agencies to launch an investigation into the virus’s origins.

We may never know for sure whether the “lab leak” theory is true, but it isn’t crazy (David Leonhardt has a good explainer on why in The New York Times). Outside of conservative media, though, it’s been treated that way. The episode reveals “vulnerabilities in the mainstream- and liberal-media ecosystem,” Jonathan Chait wrote in New York on Wednesday.

Media coverage of the lab-leak hypothesis was a debacle, and a major source of that failure was groupthink cultivated on Twitter … While some journalists took the question seriously, many of them bluntly conflated the lab-leak hypothesis with different claims made by Trump and his allies: that the virus was originally created as a biological weapon, or even that China intentionally started the pandemic. Story after story depicted the lab-leak hypothesis as clearly false and even racist.

The conflation with other claims extended into public polling. In March, the research firm PRRI asked 5,149 American adults whether they agreed or disagreed with the statement, “The coronavirus that causes Covid-19 was developed intentionally by scientists in a lab.”

This question wasn’t the focus of the survey. Rather, it was one of a number of questions intended to gauge respondents’ support of QAnon, appearing alongside questions that asked people whether they believed the election was stolen from Donald Trump and whether they agreed with the statement that “because things have gotten so far off track, true American patriots may have to resort to violence in order to save our country.”

Here’s how respondents answered:

Do you agree or disagree with the statement “The coronavirus that causes Covid-19 was developed intentionally by scientists in a lab”?

Completely agree: 15%
Mostly agree: 24%
Mostly disagree: 26%
Completely disagree: 33%
Skipped/refused: 3%

Pew also used the lab-leak belief back in April 2020 as a way to gauge how many Americans were misinformed about Covid-19. (“Scientists have determined the virus came about naturally, but there is some uncertainty about how it first infected people.”)

These surveys begin from the premise that the lab-leak theory is incorrect. They also show that a lot of Americans believed it, and as more information emerges, it turns out that those Americans could be right.

So now what? How much does it matter if something that was treated as a loony (albeit fairly widely held) belief turns out to be true, or at least not to be definitely false? Are there implications for beliefs about, say, the safety of Covid vaccines? Will people who believe in other conspiracy theories feel validated? And, like, why wouldn't they?

It’s certainly a point for the “Facebook shouldn’t be fact-checking” crowd. Facebook “will no longer take down posts claiming that Covid-19 was manmade or manufactured,” a company spokesperson told Politico on Wednesday, “a move that acknowledges the renewed debate about the virus’ origins.”

The entire incident is also a pretty great example for people who claim fact-checking is biased against conservatives. PolitiFact “archived” its fact-check from last year on the man-made hypothesis, which it rated “Pants on Fire” at the time. “When this fact-check was first published in September 2020, PolitiFact’s sources included researchers who asserted the SARS-CoV-2 virus could not have been manipulated,” the editors wrote. “That assertion is now more widely disputed. For that reason, we are removing this fact-check from our database pending a more thorough review.”

In her newsletter, Zeynep Tufekci criticized the way the PolitiFact fact-check was originally written:

An honest evaluation in September 2020 — before the WHO investigative trip and everything that has been revealed since — would be something along the lines of this: “We don’t know and there are a lot of conflicting opinions about this, and the evidence base is incomplete and different groups of scientists have different views. We are not in a position to assign plausibility levels because that’s what scientific debate is about and we are not scientists or investigative journalists, and we are supposed to fact-check things that are clear facts, not resolve complex scientific debates taking place in a politically-charged landscape.”

Instead, [Li-Meng] Yan is pitted against “scientists” who are presented as if they have a singular view and consensus on this topic. The idea that the coronavirus is completely zoonotic without any plausible involvement by the lab is presented as somehow being completely established beyond any reasonable doubt, and thus a conspiracy theory.

And the last sentence in the post is especially striking: social media is seen as the place where people just “parrot” Dr. Yan’s claims, misinformation which is presumably to be countered by this fact-check. However, the fact-check itself isn’t even an explanation, it’s a series of parroting itself, of what the authorities said at the time, and the post even makes some claims that are outright false.

And in his newsletter, Matt Yglesias wrote about how the dismissal of the lab-made hypothesis by the media was a “huge fuckup” even if it doesn’t have long-term policy implications: “I think it’s increasingly clear that this was a huge fiasco for the mainstream press that got way over their skis in terms of discourse-policing.”

It wasn’t a huge fiasco for everyone in the mainstream press, of course, as The Washington Post’s Emily Rauhala has pointed out.

But the way the theory was dismissed, Yglesias writes, “illustrates the perils of expert dialogue on social media”:

Social media is truly social in the sense that it features incredible pressures to form in-groups and out-groups and then to conform to your in-group. Unless you like and admire Cotton and Pompeo and want to be known to the world as a follower of Cotton-Pompeo Thought, it is not very compelling to speak up in favor of a minority viewpoint among scientists. Why spend your day in nasty fights on Twitter when you could be doing science? Then if you secure your impression of what “the scientists” think about something from scanning Twitter, you will perceive a consensus that is not really there. If something is a 70-30 issue but the 30 are keeping their heads down, it can look like a 98-2 issue.

I do not know a lot about science, so I will not opine how generally true this may or may not be.

But in economics, which I do know well, I think it’s a big issue. If someone tweets something you agree with, it is easy to bless it with an RT or a little heart. To take issue with it is to start a fight. And conversely, it’s much more pleasant to do a tweet that is greeted with lots of RTs and little hearts rather than one that starts fights. So I know from talking to econ PhD-havers that almost everyone is disproportionately avoiding statements they believe to be locally unpopular in their community. There is just more disagreement and dissension than you would know unless you took the time to reach out to people and speak to them in a more relaxed way.

My strong suspicion is that this is true across domains of expertise, and is creating a lot of bubbles of fake consensus that can become very misleading. And I don’t have a solution.

It’s too early to know how this all is going to play out, but it is at the very least a fraught and fascinating reminder of why correcting misinformation online is so difficult and why it requires humility and caution. It’s a reminder that we need “not enough evidence” tags and that it might often be a good idea to acknowledge what you don’t know.

For conservatives who have long criticized Facebook “censorship,” the company’s change is a gift and an excellent “don’t trust the media” talking point.

In a way, it’s only one little thing. But it’s a big thing too. It’s mostly a reminder, I think, that this is all extremely complicated, and that if journalists can find a way to insert nuance and acknowledge uncertainty in their work, they’ll be better equipped, at least mentally, to deal with fallout if and when it comes. I don’t know if nuance and uncertainty are enough to convince anyone who doesn’t already agree with you (probably not), but maybe there is a way to do this that doesn’t leave many people feeling duped or misled. We are working in a media environment, and living in a country, where it can be really hard to admit you’ve changed your mind about anything. The pandemic has provided us with many good examples of that.

Fifteen to twenty percent of American adults “mostly or completely” believe in the far-right conspiracy theory QAnon, according to a new study. Though QAnon stuff can be tough to poll, this is depressingly straightforward: They mostly or completely agree with statements like “the government, media, and financial worlds in the U.S. are controlled by a group of Satan-worshipping pedophiles who run a global child sex trafficking operation.”

The strongest predictor of whether they believe? The type of media they consume, finds the study, from PRRI (Public Religion Research Institute), a nonprofit, nonpartisan research organization.

“Even after controlling for partisanship and ideology, media news consumption is by far the strongest independent predictor of QAnon beliefs,” the researchers write. “Remarkably, those who report most trusting far-right media sources are nearly nine times more likely to be QAnon believers compared to those who most trust broadcast networks such as ABC, CBS, and NBC.”

Republicans and conservatives are much more likely to believe in QAnon than Democrats and liberals: 23% of Republicans believe, compared to 12% of independents and 7% of Democrats.

Nearly a third of all respondents — 29% — said they “completely” or “mostly” agreed that the 2020 election was stolen from Donald Trump.

From the survey of 5,149 American adults, which was conducted online in March:

The sources that Americans turn to for news are closely linked with openness to QAnon views. Americans are most likely to say the television news sources they trust most to provide accurate information about politics and current events are the major broadcast networks (17%), such as ABC, CBS, and NBC. One in ten or more report most trusting local television news (13%), Fox News (11%), and CNN (10%). Fewer rely on public television (8%), MSNBC (5%), and far-right news networks (3%) such as One America News Network (OANN) and Newsmax. Three in ten (30%) say that they do not watch television news, and 2% report turning to some other source.

Around four in ten Americans who say they most trust far-right news outlets such as OANN and Newsmax (40%) for television news agree with the statement that “the government, media, and financial worlds in the U.S. are controlled by a group of Satan-worshipping pedophiles who run a global child sex trafficking operation.” Around one in five Americans who do not watch television news (21%) and trust Fox News (18%) agree. Around one in ten Americans or less who trust local news (12%), CNN (11%), broadcast networks such as ABC, CBS, and NBC (8%), public television (7%), and MSNBC (5%) believe this core tenet of QAnon.

Nearly half of Americans who trust far-right news (48%) and one-third who trust Fox News (34%) agree with the statement that “There is a storm coming soon that will sweep away the elites in power and restore the rightful leaders.” About one in five who do not watch television news (22%), those who report trusting local news most (18%), and those who report trusting CNN most (17%) agree with this theory. Fewer Americans who trust MSNBC (14%), broadcast news (12%) or public television (11%) agree.

Around four in ten Americans who most trust far-right news sources (42%) and around one in four who most trust Fox News (27%) agree that “Because things have gotten so far off track, true American patriots may have to resort to violence in order to save our country.” Less than one in five Americans who do not watch television news (19%) or who trust local news (16%) agree, and less than one in ten who trust CNN (9%), broadcast news (8%), public television (7%), or MSNBC (7%) agree.

Strikingly, the researchers hadn’t asked about “far-right news outlets” like OANN and Newsmax by name. The survey question listed the major networks and offered an open-ended “Other” option.

But 3% of respondents wrote them into the “Other” box, enough for the researchers to make them into their own category. “We often don’t get responses to open-ended ‘other’ boxes, much less responses that are so consistent,” PRRI research director Natalie Jackson told me on Twitter.

Photo of a Trump rally by Becker1999 used under a Creative Commons license.

Laura Hazard Owen is the editor of Nieman Lab. You can reach her via email or Twitter DM (@laurahazardowen).