June 8, 2018, 8:30 a.m.
Audience & Social

How can we restore trust in news? Here are 9 takeaways from Knight-supported research

“Finding strategies for artfully conveying complex information in ways that break down attention and trust-based barriers represents the most important challenge in our politically tumultuous time.”

Editor’s note: As part of its effort to explore the root causes of the current crisis in trust in the media, the Knight Foundation is commissioning a continuing series of white papers from academics and experts. Here’s what they’ve learned so far. (Disclosure: Knight has also been a funder of Nieman Lab.)

Institutional trust is down across the board in American society (with a few notable exceptions, such as the military). But the decline in trust in the media is particularly troubling: it plummeted from 72 percent in 1976 to 32 percent in 2017. There are many reasons for the drop, writes Yuval Levin, but one of them is that the rise of social media has pushed journalists to focus on developing personal brands:

“This makes it difficult to distinguish the work of individuals from the work of institutions, and increasingly turns journalistic institutions into platforms for the personal brands of individual reporters. When journalists’ work appears indistinguishable from grabbing a megaphone, they become harder to trust. They aren’t really asking for trust.”

[S]ocial media in particular have turned many journalists from participants in the work of institutions to managers of personal brands who carefully tend to their own public presence and presentation.

Humans are biologically wired to respond positively to information that supports their own beliefs and negatively to information that contradicts them, writes Peter Wehner. He also points out that beliefs are often bound up with personal identity, and that changing them may put a person at risk of rejection from their community.

In a sense, people see what they want to see, in order to believe what they want to believe. In addition, everyone likes to be proven right, and changing their views is an admission that they were wrong, or at least had an incomplete understanding of an issue.

As people increasingly rely on social media platforms to get information, they are at the mercy of opaque algorithms they don’t control, write Samantha Bradshaw and Philip Howard. These algorithms are optimized to maximize advertising dollars for the platforms. Since people tend to share information that provokes strong emotions and confirms what they already believe, “the speed and scale at which content ‘goes viral’ grows exponentially, regardless of whether or not the information it contains is true.”

[T]he filtering of information that takes place on social media is not the product of the conscious choices of human users. Rather, what we see on our social media feeds and in our Google search results is the product of calculations made by powerful algorithms and machine learning models.
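To make that mechanism concrete, here is a minimal, purely illustrative sketch in Python of a feed ranked only on predicted engagement. It is not any platform’s actual algorithm; the post attributes, weights, and scores are all invented for this example. The structural point it illustrates is the one Bradshaw and Howard make: when the ranking function rewards emotional provocation and confirmation, accuracy never enters the calculation.

```python
# Illustrative only: a toy feed ranker, not any real platform's algorithm.
# All field names, weights, and scores are invented for this sketch.
from dataclasses import dataclass

@dataclass
class Post:
    headline: str
    predicted_outrage: float    # 0..1, how strongly the post provokes emotion
    confirms_user_views: float  # 0..1, fit with the reader's prior beliefs
    is_accurate: bool           # note: truth plays no role in the ranking below

def engagement_score(post: Post) -> float:
    # A ranker optimized for ad revenue rewards emotion and confirmation.
    # Accuracy is carried along as data but never enters the score.
    return 0.6 * post.predicted_outrage + 0.4 * post.confirms_user_views

feed = [
    Post("Outrageous claim about the other side", 0.9, 0.8, is_accurate=False),
    Post("Careful explainer on the federal budget", 0.2, 0.3, is_accurate=True),
]

# The false-but-provocative post rises to the top of the feed.
for post in sorted(feed, key=engagement_score, reverse=True):
    print(f"{engagement_score(post):.2f}  {post.headline}")
```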

The conventional wisdom these days is that we’re all trapped in filter bubbles or echo chambers, listening only to people like ourselves. But, write Andrew Guess, Benjamin Lyons, Brendan Nyhan, and Jason Reifler, the reality is more nuanced. While people tend to self-report a filtered media diet, other data show that many people do not engage much with political information at all, choosing entertainment over news instead. That doesn’t mean there is no problem, though: “[P]olarized media consumption is much more common among an important segment of the public — the most politically active, knowledgeable, and engaged. This group is disproportionately visible online and in public life.”

A deep dive into the academic literature tells us that the “echo chambers” narrative captures, at most, the experience of a minority of the public. Indeed, this claim itself has ironically been amplified and distorted in a kind of echo chamber effect.

People may be predisposed to hold on to beliefs that are agreeable to them, but they are also more likely to believe a correction if it comes from a source they think would promote an opposing opinion. Offering a simple correction alone, however, rarely works. And even when people accept corrections, other studies show that a taint, called a “belief echo,” persists: the false belief continues to affect attitudes.

[P]eople are more likely to believe a correction if it comes from a source for whom it runs counter to personal and political interests.

There’s no one-size-fits-all way to communicate complicated information, write Erika Franklin Fowler and Natalie Jomini Stroud, but science can help. Different goals require different types of information. If we know people don’t have the time or motivation to pay attention to in-depth information on every issue, we might encourage the use of endorsements or other cues from trusted sources. If we seek to increase participation, it helps to encourage citizens to join groups and to consume like-minded information; but if we want to encourage empathy or deliberation, we need more balanced information that compassionately represents others. Sometimes people learn best through experiences, or by getting issue-oriented information from organizations they trust.

Finding strategies for artfully conveying complex information in ways that break down attention and trust-based barriers represents the most important challenge in our politically tumultuous time. But it’s one we can meet, and science can help.

The conventional wisdom says Americans don’t know things because they can’t be bothered to know them. But lack of motivation isn’t the whole story. News stories often cover breaking developments without the contextual information that supplies basic facts about an issue, whether it’s the federal budget or climate change. Experiments show that people can be open to information about complex subjects if it’s provided within the context of a news report.

Emily Thorson, a professor of political science, conducted an experiment in which she gave participants one of two versions of a news story about the federal budget: one with contextual information set off in a box, and one without. When questioned afterward, people who read the version with the contextual information reported more accurate facts about the budget than those who read the version without it.

[C]ontextual fact-checks can be remarkably successful in correcting misperceptions. In addition, compared to fact-checks of politicians and candidates, they run a smaller risk of creating a partisan backlash.

We need a way to think about information that goes beyond “agreeable” or “disagreeable.” “I object more fundamentally to the notion that all mass affirmation is always bad, and its corollary, that unwanted or unplanned encounters are always good. In true academic fashion, I will argue that it depends: each can sometimes be good and sometimes bad,” writes Deen Freelon.

Freelon proposes a different way to think about the information that we consume: a three-dimensional “filter map”:

  • Agreeableness. This is the degree to which information fits into our preexisting opinions.
  • Truth value. This is simply whether a given message is true or false.
  • Legitimacy. While difficult to define, this “usually reduces to an opinion’s adherence to widely accepted ethical norms like freedom, equality, fairness and human rights. While there is considerable debate around what kinds of opinions comport with such principles, it can be safely said that crimes like racial discrimination, torture and arbitrary detention definitively violate them.”

Ideally, our media filters would optimize for truth and legitimacy, ensuring that both agreeable and disagreeable content and sources are included (the map’s four blue cells)… By the same token, false and illegitimate messages would be excluded, again regardless of agreeableness (the four white cells). The conceptual leap I make here is from considering disagreeableness as a virtue in itself, to distinguishing between more and less desirable types of disagreeable content. There are many claims and opinions we should rightly dismiss out of hand, but there are others we should entertain despite disagreeing with them.
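Freelon defines the map conceptually, but the filtering rule in the passage above reduces to a simple predicate. Here is a hypothetical sketch in Python (the class and field names are invented for illustration, not taken from his paper): a message passes the ideal filter if it is both true and legitimate, regardless of whether it is agreeable.

```python
# A sketch of Freelon's three-dimensional "filter map" as a data structure.
# The names below are hypothetical; the paper defines the dimensions
# conceptually, not as code.
from dataclasses import dataclass

@dataclass
class Message:
    text: str
    is_agreeable: bool   # fits the reader's preexisting opinions
    is_true: bool        # truth value of the claim
    is_legitimate: bool  # comports with widely accepted ethical norms

def passes_ideal_filter(msg: Message) -> bool:
    # Keep what is true AND legitimate; agreeableness is deliberately
    # ignored, so disagreeable-but-true messages survive.
    return msg.is_true and msg.is_legitimate

messages = [
    Message("Inconvenient but accurate policy critique",
            is_agreeable=False, is_true=True, is_legitimate=True),
    Message("Flattering but false rumor",
            is_agreeable=True, is_true=False, is_legitimate=True),
]

kept = [m.text for m in messages if passes_ideal_filter(m)]
print(kept)  # only the true, legitimate message survives, agreeable or not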

“Misinformation isn’t new,” write Danielle Allen and Justin Pottle, “and our problem is not, fundamentally, one of intermingling of fact and fiction, not a confusion of propaganda with hard-bitten, fact-based policy. Instead, it’s what we now call polarization, but what founding father James Madison referred to as ‘faction.’”

Madison wasn’t concerned about disagreement in and of itself. Rather, he thought about structural ways to bring people together despite those differences. He advocated for a large republic with a relatively small legislature in which each representative represented a wider variety of groups and individuals.

Societal challenges such as the disappearance of many local and regional newspapers, a growing concentration of people living in ideological groupings, and the loss of credibility of many colleges and universities among conservatives have all helped undermine “the institutions whose job it is to broker the debate within the citizenry about what different people see as credible or incredible.”

Allen and Pottle suggest a number of strategies to bring Americans together in united experiences, such as instituting a national service requirement, establishing geographic lotteries for elite learning institutions, and reviving local journalism with philanthropy.

Our problem is the breakdown of institutions that facilitate valid social learning across diverse, disagreeing groups. Historically, the institutions that facilitate social learning, for example newspapers, schools, colleges and universities, have served also as anchors for shared norms of inquiry, including for the aforementioned commitment to honesty, for ideologically diverse populations.

Nancy Watzman is editor of Trust, Media & Democracy for the Knight Commission on Trust, Media and Democracy, where this piece was originally published. The commission is developing a report and recommendations on how to improve American democracy, and it is gathering public comments for its work. Here’s how to submit yours.
