People’s trust in media: bad. Attempts to fix that: good. Now, what the heck are they?
Facing the fallout from the presidential election, the rise of fake news, and the stark challenges of covering a presidency that aims to undermine press credibility, the U.S. journalism industry has been positively ballooning with trust-geared initiatives. So much so that it’s become hard for even us to keep them all straight. This guide will hopefully help you detangle Trusting News from the Trust Project and News Guard from the News Integrity Initiative, and more — all projects with valuable missions to go with their incredibly similar names.
The underlying context for trust in news: A Knight-Gallup report this year found that the average media trust score (1-100) for American adults ranged from numbers in the 50s for Democrats to 18 for conservative Republicans. Less than half of those surveyed could identify a news source they believe is objective. This isn’t new: Even before Donald Trump started campaigning, Gallup’s annual poll has shown a decline from 55 percent of Americans saying they had a “great deal/fair amount” of trust in mass media in 1997 to 32 percent in 2016. So, yikes.
Here’s a trustworthy rundown of what trusted trust-initiatives are trying to do to rebuild trust. Trust us.
Trusting News

Team: Joy Mayer, Lynn Walsh
Funders: Reynolds Journalism Institute, Democracy Fund, Knight Foundation
Participants/partners: Mainly local newsrooms, such as WCPO, the Fort Worth Star-Telegram, St. Louis Magazine; also A Plus, Religion News Service, CALmatters, Discourse Media, USA Today
Mayer researches and designs strategies that (primarily) local newsrooms can test in a monitored process for building trust with their own audiences, with a focus on social media outreach. As Mayer explained to me in our previous coverage of Trusting News, “The biggest way newsrooms in this project are having success on Facebook is by participating in the conversations that happen there and using every interaction as an opportunity to explain their credibility.”

With roots in newsrooms, journalism academia, and community engagement, Mayer started Trusting News as a research project of Mizzou’s Reynolds Journalism Institute and began testing strategies with newsrooms in January 2016. Dozens of newsrooms have formally tested the strategies so far (search this database for social media ideas across two rounds of testing), and others have picked up some of her methods along the way. Newsrooms pay nothing for the strategies, though they also receive no financial support for participating.
The Trust Project

Team: Sally Lehrman, Santa Clara University’s Markkula Center for Applied Ethics
Funders: Craig Newmark Philanthropies, Google, the Knight Foundation, the Democracy Fund, the Markkula Foundation
Participants/partners: News outlets like the Washington Post, The Economist, the Globe and Mail, Mic, and Zeit Online; tech companies like Facebook, Google, Twitter, and Bing; Institute for Nonprofit News
The Trust Project gained prominence last November, though its origins date back to 2014. It’s a commitment between major news organizations and tech companies to “provide clarity on the [news organizations’] ethics and other standards, the journalists’ backgrounds, and how they do their work.”

A group of representatives from media companies developed eight “core indicators” to help the public get more context about a news organization. The indicators are standardized so that publishers can integrate the format into their CMSes and site code, and so that search engines and social media platforms can integrate them into their systems. When my colleague Laura reported on the Trust Project last fall, however, she noted that “the tech giants’ buy-in appears experimental and limited. Nobody is saying that they’ll favor Trust Project partners in their algorithms or anything like that.”
“The public can look at this and say, ‘Okay, I know more about what’s behind this organization’,” Lehrman, senior director of journalism ethics at the Markkula Center for Applied Ethics at Santa Clara University and the creator of the project, told her. “Hopefully, it will pull back the curtain on some of our practices as journalists, which, in fact, a lot of people don’t know about. And this lack of transparency is partly what creates a sense of suspicion.”
News Integrity Initiative

Team: Jeff Jarvis, Molly de Aguiar, CUNY Graduate School of Journalism
Funders: 19 organizations and individuals worldwide, including Facebook, Craig Newmark Philanthropies, the Knight Foundation, the Tow Foundation, the Democracy Fund, advertising exchange AppNexus, and PR companies Edelman and Weber Shandwick
Participants/partners: The following groups received grants from the initiative’s first round of funding: Arizona State University’s News Co/lab, Center for Investigative Reporting, Center for Media Engagement, EducationNC, Free Press, Listening Post Collective, Maynard Institute, OpenNews, Public Radio International, The Coral Project; Internews and the European Journalism Centre have also received funding
As we’ve reported before, “the News Integrity Initiative’s ultimate goal is to shift the focus away from what news consumers can do to improve their news literacy and instead focus efforts on specific measures that news organizations, platforms, and others can implement to improve things on their end.” That materializes as grants to bring to life specific ideas from the groups mentioned above, a cross-industry network of individuals who can implement those ideas, research into the issues, and events to share ideas and promote solutions.

“Fundamentally, we have always seen NII as a public-service project,” de Aguiar told Nieman Lab last summer. “We want people to feel powerful — visible, valued, and engaged in their communities because they are armed with relevant and reliable news and information.”
De Aguiar also leads the initiative’s Year of Listening project, which collects case studies, research, toolkits and guides, and inspiration for community engagement and civic dialogue projects by newsrooms over the course of 2018. A big Year of Listening component is the Community Listening and Engagement Fund, developed by the News Integrity Initiative and other journalism funders like the Lenfest Institute, Knight, and Democracy Fund to provide financial aid for engagement tools Hearken and GroundSource in smaller newsrooms.
Duke Tech & Check Cooperative

Team: Bill Adair, Duke Reporters’ Lab
Funders: Knight Foundation, Facebook Journalism Project, Craig Newmark Philanthropies
Participants/partners: Duke University, University of Texas–Arlington, Internet Archive, the Bad Idea Factory’s Truth Goggles, Cal Poly-San Luis Obispo’s Digital Democracy initiative
Formally called the Duke Tech & Check Cooperative, the project appears under the name Trust & News Initiative on Knight’s funding page. The Cooperative’s leader, Bill Adair, dreams of automated fact-checking that could stop the spread of falsehoods (and maybe bolster trust in real facts in the meantime).

The $1.2 million in funding should keep the Cooperative alive for two years while it experiments with mining transcripts for fact-checkable claims, pop-up fact checks on online articles, and a talking-point tracker. “This is largely a technology and journalism project. It’s not a social psychology project,” Adair told me before the State of the Union beta test of the Cooperative’s automated live fact-checking app.
NewsGuard

Team: Cofounders and co-CEOs Steve Brill and L. Gordon Crovitz, executive editor Jim Warren, managing editor Eric Effron
Funders: It’s a for-profit startup with $6 million from investors (full list here) including the cofounders, Knight Foundation, advertising and PR agency Publicis Groupe, and a number of individuals
Participants/partners: Third-party organizations that track metrics for social media and site traffic; the Trust Project (see above!); and the dozens of analysts with journalism backgrounds the company plans to hire to rate information sources
Brill and Crovitz, known for Court TV and The Wall Street Journal, respectively, are launching NewsGuard as a database of “nutrition labels” for news organizations. These labels could then be licensed to search engines, social media platforms, and/or advertisers — anyone interested in letting someone else decide which publishers are trustworthy. Ideally, this information — researched and compiled by real humans with a journalistic approach — would help those groups avoid prioritizing debunked sites, and could replace the need for platforms to make editorial judgments about news sources themselves.

The database will cover more than 7,500 news sources, each with a red, yellow, or green rating based on its journalistic qualities. The analysts will also provide a 200- to 300-word writeup on each source’s funding, its coverage, its potential special interests, and how it fits into the rest of the news ecosystem, as my colleague Shan reported previously. News sites that participate in the Trust Project will also earn points.
Ken Doctor’s takeaway on NewsGuard: “NewsGuard could be a simple, elegant solution to both the honest hand-wringing and the overwrought blather about fake news. But we still have to ask the question that NewsGuard won’t answer until it has to: What color is Fox News? Wait for the next NewsGuard announcement. Can it get either Facebook or Google to sign on, or work its way upwards with a next-level player?”
Deepnews.ai

Team: Frederic Filloux, Stanford students
Funders/partners: Deepnews.ai’s website lists a number of organizations as funders and/or partners, including Stanford’s JSK Fellowship (Filloux was a fellow last year), Google’s Digital News Innovation Fund, Knight Foundation, Knapyse SAS France, Feedly, Diffbot, The Guardian, the Reynolds Journalism Institute, the Trust Project, the Credibility Coalition, and Mather Economics
Filloux’s Deepnews.ai began as the News Quality Scoring Project during his time as a fellow at Stanford; I wrote about the effort last fall: “Filloux is aiming for a human/machine, subjective/objective hybrid as the still-under-construction algorithm learns. With the algorithm comparing signals and ‘qualified professionals’ manually assigning labels to news items, the scoring system should distance ‘commodity news’ (content based on pageviews, churn, etc.) from ‘value-added news’ (original reporting that involves balance, expertise, and innovation). Filloux is designing a mechanism to assess news articles based on quantifiable and qualitative signals — like word count, freshness versus evergreen material, quote density, contextualization, the presence of a byline — in order to surface higher-quality content for both readers and publishers.”
Now the project has rebranded as Deepnews.ai, leaning further on machine learning to identify and label signals of editorial quality. Students in a Stanford computer science class have been developing a deep learning model based on millions of articles from about 10 news sources. They’re also inviting people to manually test the qualitative-scoring side, based on subjective signals like thoroughness, balance/fairness, the article’s lifespan, and the article’s relevancy.
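To make the quantifiable side of this concrete, here is a minimal sketch of how signals like word count, quote density, and byline presence might be extracted and combined. To be clear: the function names, weights, and heuristics below are invented for illustration only — Deepnews.ai’s actual system is a deep learning model trained on millions of articles, not a hand-weighted formula like this.

```python
import re

def extract_signals(article_text, byline=None):
    """Pull a few simple, quantifiable signals from an article.

    Illustrative only: these are assumed stand-ins for the kinds of
    signals Filloux describes (word count, quote density, byline).
    """
    words = article_text.split()
    quotes = re.findall(r'"[^"]+"', article_text)  # naive straight-quote matching
    quoted_words = sum(len(q.split()) for q in quotes)
    return {
        "word_count": len(words),
        "quote_density": quoted_words / len(words) if words else 0.0,
        "has_byline": byline is not None,
    }

def toy_quality_score(signals):
    """Combine signals into a 0-1 score. Weights are made up for illustration."""
    score = 0.0
    score += min(signals["word_count"] / 1000, 1.0) * 0.5   # longer pieces score higher, capped
    score += min(signals["quote_density"] * 5, 1.0) * 0.3   # reward sourcing, capped
    score += 0.2 if signals["has_byline"] else 0.0          # bylined work gets a bonus
    return score
```

The real system replaces hand-tuned weights like these with learned parameters, and pairs the quantitative signals with human labels for subjective qualities like thoroughness and balance.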
Journalism Trust Initiative

Team: Reporters Without Borders (RSF)
Funders: RSF receives financial and in-kind support from a number of groups, ranging from corporations to foundations to government agencies. There’s no particular backer identified specifically for this initiative.
Partners: Agence France Presse, the European Broadcasting Union, and the Global Editors Network
The newest member of the trust pursuit — just announced this week — is a European, publisher-focused effort to combat disinformation and self-regulate trust and transparency. The group is working with a number of news outlets and associations (and Google, according to its press release) to develop standards of transparency around media ownership, revenue sources, journalistic methods, and compliance with ethical norms and independence. It’s also aiming for an eventual certification process. The standards will be under construction for 12 to 18 months, with a kickoff workshop on May 23 in France.