Prediction: Humans hold their own against the robots
Name: Rubina Madan Fillion
Excerpt: “Rather than despair, we need to find ways to highlight the people and processes behind our journalism.”

I spent a decent amount of time conversing with chatbots this year, trying to gauge how effective they are at writing and editing. Their headlines were boring. Their social posts sounded like they were written by overcaffeinated marketers. Their copy was either predictable or inaccurate. They certainly couldn’t gain the trust of sources or come up with great story ideas. In short, most reporters don’t have to fear for their jobs anytime soon. Nor do editors.

That isn’t to say we shouldn’t use artificial intelligence in journalism. I wrote a Nieman Lab prediction six years ago about how AI could personalize news, make school board meetings more accessible, and free up journalists’ time to work on more complex stories. Earlier this year, The New York Times was one of the news organizations that announced it was exploring ways to use AI to support our journalists and grow our audience. “What I find exciting is how we can use AI to make the people and processes behind our work more available and accessible to more people. That’s at the heart of our work,” said Alex Hardiman, chief product officer at the Times. “I agree that we’re wholly uninterested in AI replacing human expertise and judgment, but we’re seeing more and more ways that AI can amplify it in responsible and accurate ways.”

With trust in news organizations continuing to decline, we need to make it easier for audiences to get to know the journalists behind the stories. Establishing trust with readers means humanizing the way we present our work. Only 21% of Americans say they’ve ever spoken to a local journalist, according to a Pew Research Center survey. When print was the primary way people got their news, this distance from journalists was less of an issue than it is today, when misinformation is rampant and it’s more difficult to identify reputable news sources. These problems have only deepened with the proliferation of generative AI.

The Washington Post showed the humanity and care its journalists put into producing “Terror on Repeat,” a powerful project on the devastation caused by AR-15 shootings. Executive editor Sally Buzbee wrote two editors’ notes explaining why the Post published disturbing content to illuminate the reality of these shootings, and why it showed the impact of bullets from an AR-15 on the human body. These notes described the reporting process for the project, including filing 13 public information requests and scrutinizing nearly 100 autopsy reports from five mass shootings. Buzbee also explained the editorial decision-making behind the choice to publish the images. The two notes drew more than 2,000 comments, many of them from readers who appreciated this transparency.

How people get their news has splintered, so media organizations need to offer many ways for audiences to get to know their journalists. At the Times, we worked across the newsroom and Opinion to roll out enhanced bylines and bios that provide more context about writers than traditional author pages. The enhanced bios are conversational, written in the first person, and include sections on reporters’ backgrounds, what they cover, and their journalistic ethics. Reporters’ bylines now also include information about where they reported from, a change we made after finding that most readers don’t understand traditional datelines. A chatbot is unlikely to interview sources on the ground in Jerusalem or Gaza City, so this detail offers important context for readers.

There are creative ways to introduce the people behind the journalism. CNN has a newsletter, Inside CNN, that features Q&As with individual employees. And it’s not limited to traditional reporters and editors. A recent edition featured Kendall Trammell, a senior producer who leads weekend programming for CNN Digital. The questions are designed to inform readers about both the journalist’s day-to-day work and who they are outside of the newsroom: “What is a common misconception people have about your job?” “How do you manage the stress?” “What is something that people don’t know about you?”

To build trust, it’s important to give readers a view into how decisions are made in newsrooms. When The New York Times Book Review published its list of the 10 Best Books of 2023, there were many ways for book lovers to learn how the titles were selected. A TikTok video featured Book Review editor Gilbert Cruz and his colleagues talking about their favorites. That vertical video appeared on the Times’ homepage, right next to the list itself. The Times’ flagship newsletter, The Morning, sent a weekend edition that highlighted the debates and research that went into picking the best books. For those who prefer audio, the Book Review podcast included conversations with the editors. This is a change from the historically opaque selection process for end-of-year lists. It’s reminiscent of Wirecutter, which has long been transparent about its exhaustive testing, underscoring the value of having humans, rather than algorithms, make recommendations.

Trusting News is an excellent resource for examples of how local news organizations have been transparent about how they make decisions:

  • A Kansas City Star journalist used TikTok to explain why reporters use terms like “allegedly” and “accused of” in crime reporting.
  • After getting pushback from readers about its Queer LA project, an LAist reporter explained the complex history of the word “queer” and how LAist uses the term.
  • The Texas Tribune published an election guide that explained how it chooses which races to cover, how readers inform its work, and how it holds politicians accountable.

Artificial intelligence isn’t going anywhere. As I typed this prediction, Google Docs kept offering to help me write it using AI. The solution is not to prepare for a binary world of humans vs. machines. Rather than despair, we need to find ways to highlight the people and processes behind our journalism.

Rubina Madan Fillion is the director of strategy for The New York Times’ Opinion section.
