Nieman Foundation at Harvard
Sept. 21, 2017, 9:59 a.m.
Reporting & Production

What are the most useful ways to bring artificial intelligence into newsrooms? How can journalists use it in their reporting process? Is it going to replace newsroom jobs?

A report out this week from the Tow Center for Digital Journalism looks at how AI can be adapted to journalism. It summarizes a previously off-the-record meeting held back in June by the Tow Center and the Brown Institute for Media Innovation. (Full disclosure: Nieman Lab director Joshua Benton was part of the meeting.)

Among the report’s findings:

AI “helps reporters find and tell stories that were previously out of reach or impractical.” Three areas where AI can be particularly helpful in the newsroom: “Finding needles in haystacks” (discovering things in data that humans can’t, which humans can then fact-check); identifying trends or outliers; and serving as the subject of a story itself: “Because they are built by humans, algorithms harbor human bias — and by examining them, we can discover previously unseen bias.”
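The “needles in haystacks” idea can be as simple as statistical outlier detection: flag records that deviate sharply from the rest of a dataset, then hand them to a reporter to fact-check. A minimal sketch, using hypothetical city-contract amounts (the data, function name, and threshold are illustrative assumptions, not from the report):

```python
# Sketch of outlier-flagging for reporters: surface values that deviate
# sharply from the rest of a dataset so a human can investigate them.
from statistics import mean, stdev

def flag_outliers(values, threshold=2.0):
    """Return values whose z-score magnitude exceeds the threshold."""
    mu = mean(values)
    sigma = stdev(values)
    return [v for v in values if abs(v - mu) / sigma > threshold]

# Hypothetical contract amounts in dollars; the last one merits a closer look.
contracts = [9800, 10200, 9900, 10100, 10050, 9950, 10000, 250000]
print(flag_outliers(contracts))  # → [250000]
```

A real newsroom pipeline would use more robust methods (median-based scores, domain-specific rules), but the workflow is the same: the machine surfaces candidates; humans verify them.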

AI can deliver much more personalized news — for good and bad. AI could be used to monitor readers’ likes and dislikes, ultimately shaping stories to people’s individual interests. But, as one participant cautioned:

The first stage of personalization is recommending articles; the long-term impact is filter bubbles. The next step is using NLP [Natural Language Processing] to shape an article to exactly the way you want to read it. Tone, political stance, and many other things. At that point, journalism becomes marketing. We need to be very aware that too much personalization crosses the line into a different activity.

Another concern is that, if articles become too personalized, the public record is at risk: “When everyone sees a different version of a story, there is no authoritative version to cite.”

AI brings up new ethical considerations. Participants agreed that news organizations need to disclose when AI has been used in creating a story, but “that description must be translated into non-technical terms, and told in a concise manner that lets readers understand how AI was used and how choices were made.” There’s a need for best practices around disclosures.

Also, most AI tools aren’t built specifically with newsrooms (or their editorial values) in mind. One engineer said: “A lot of these questions currently seem impenetrable to us engineers because we don’t understand the editorial values at a deep level, so we can’t model them. Engineers don’t necessarily think of the systems they are building as embodying editorial values, which is an interesting problem. The way a system like this is built does not reflect this underlying goal.”

The full report is here.

Photo of robots by Robert Heim used under a Creative Commons license.
