Nieman Foundation at Harvard
Sept. 21, 2017, 9:59 a.m.
Reporting & Production

What are the most useful ways to bring artificial intelligence into newsrooms? How can journalists use it in their reporting process? Is it going to replace newsroom jobs?

A report out this week from the Tow Center for Digital Journalism looks at how AI can be adapted to journalism. It summarizes a previously off-the-record meeting held back in June by the Tow Center and the Brown Institute for Media Innovation. (Full disclosure: Nieman Lab director Joshua Benton was part of the meeting.)

Among the report’s findings:

AI “helps reporters find and tell stories that were previously out of reach or impractical.” Three areas where AI can be particularly helpful in the newsroom: “Finding needles in haystacks” (discovering things in data that humans can’t, which humans can then fact-check); identifying trends or outliers; and as a subject of a story itself: “Because they are built by humans, algorithms harbor human bias — and by examining them, we can discover previously unseen bias.”
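As a toy illustration of the "needles in haystacks" idea, a scan like the following flags records that sit far from the rest of a data column as candidate leads for a reporter to fact-check. (This is a hypothetical sketch; the dataset, labels, and threshold are invented for illustration and are not from the report.)

```python
# Toy "needle in a haystack" scan: flag records whose value deviates
# far from the column mean, as leads for a human to fact-check.
# (Hypothetical example -- data and threshold are made up.)
def find_outliers(records, threshold=2.5):
    """Return (label, value) pairs more than `threshold` standard
    deviations from the mean of the column."""
    values = [v for _, v in records]
    mean = sum(values) / len(values)
    std = (sum((v - mean) ** 2 for v in values) / len(values)) ** 0.5
    if std == 0:
        return []
    return [(label, v) for label, v in records
            if abs(v - mean) / std > threshold]

# Nine routine payments and one that stands out.
payments = [("P1", 98), ("P2", 101), ("P3", 99), ("P4", 103),
            ("P5", 97), ("P6", 100), ("P7", 102), ("P8", 99),
            ("P9", 101), ("P10", 5000)]
print(find_outliers(payments))  # only the P10 payment is flagged
```

The machine surfaces the anomaly; as the report stresses, it is still a human who verifies whether it is a story.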

AI can deliver much more personalized news — for good and bad. AI could be used to monitor readers’ likes and dislikes, ultimately shaping stories to people’s individual interests. But, as one participant cautioned:

The first stage of personalization is recommending articles; the long-term impact is filter bubbles. The next step is using NLP [Natural Language Processing] to shape an article to exactly the way you want to read it. Tone, political stance, and many other things. At that point, journalism becomes marketing. We need to be very aware that too much personalization crosses the line into a different activity.

Another concern is that, if articles become too personalized, the public record is at risk: “When everyone sees a different version of a story, there is no authoritative version to cite.”
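The "first stage of personalization" the participant describes, recommending articles, can be sketched in a few lines: rank candidates by topic overlap with what a reader has already read. (Hypothetical illustration with invented titles and topics, not from the report.)

```python
# Toy recommender: rank candidate articles by how many topics they
# share with a reader's history. (Hypothetical sketch -- titles and
# topic tags are invented for illustration.)
def recommend(history_topics, candidates, k=2):
    def score(article):
        _, topics = article
        return len(set(topics) & set(history_topics))
    ranked = sorted(candidates, key=score, reverse=True)
    return [title for title, _ in ranked[:k]]

reader_history = ["elections", "housing"]
articles = [
    ("City council vote", ["elections", "local"]),
    ("Rent control debate", ["housing", "policy"]),
    ("Sports roundup", ["sports"]),
]
print(recommend(reader_history, articles))
# Each reader's history yields a different ranking -- the seed of
# the filter bubbles the participant warns about.
```

Even this trivial version shows the mechanism: every reader sees a differently ordered news report, which is exactly where the "no authoritative version" concern begins.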

AI raises new ethical considerations. Participants agreed that news organizations need to disclose when AI has been used in creating a story, but “that description must be translated into non-technical terms, and told in a concise manner that lets readers understand how AI was used and how choices were made.” There’s a need for best practices around disclosures.

Also, most AI tools aren’t built specifically with newsrooms (or their editorial values) in mind. One engineer said: “A lot of these questions currently seem impenetrable to us engineers because we don’t understand the editorial values at a deep level, so we can’t model them. Engineers don’t necessarily think of the systems they are building as embodying editorial values, which is an interesting problem. The way a system like this is built does not reflect this underlying goal.”

The full report is here.

Photo of robots by Robert Heim used under a Creative Commons license.
