June 13, 2017, 6:41 a.m.
Audience & Social

The New York Times, with a little help from automation, is aiming to open up most articles to comments

Previously, only 10 percent of the Times’ articles were open to comments. With Perspective, a tool from Jigsaw (Google parent Alphabet’s tech incubator), helping it evaluate comments in bulk, the Times is aiming for 80 percent by the end of this year.

The New York Times’ strategy for taming reader comments has for many years been laborious hand curation. Its community desk of moderators examines around 11,000 individual comments each day, across the 10 percent of total published articles that are open to commenting.

For the past few months, the Times has been testing a new tool from Jigsaw — Google parent Alphabet’s tech incubator — that can automate a chunk of the arduous moderation process. On Tuesday, the Times will begin expanding the number of articles open for commenting, starting with about a quarter of its stories and aiming for 80 percent by the end of this year. (Another partner, Instrument, built the CMS for moderation.)

“The bottom line on this is that the strategy on our end of moderating just about every comment by hand, and then using that process to show readers what kinds of content we’re looking for, has run its course,” Bassey Etim, Times community editor, told me. “From our end, we’ve seen that it’s working to scale comments — to the point where you can have a good large comments section that you’re also moderating very quickly, things that are widely regarded as impossible. But we’ve got a lot left to go.”

These efforts to improve its commenting functions were highlighted in the Times announcement earlier this month about the creation of a Reader Center, led by Times editor Hanna Ingber, to deal specifically with reader concerns and insights. (In the same announcement, the Times said it was eliminating its public editor position and parting ways with Liz Spayd, the last person to hold the job.)

Nudging readers toward the kinds of comments the Times “is looking for” is no easy task. Its own guidelines, laid out in an internal document that spells out rules for comments and how to act on them, have evolved over time. (I took the Times’ moderation quiz, got only one comment “correct,” and at my pace it would have taken more than 24 hours to tag 11,000 comments.)

Jigsaw’s tool, called Perspective, has been fed a corpus of Times comments that human editors had already tagged. During the testing phase, those editors continued to train the algorithm by flagging the moderation mistakes it made. In the new system, a moderator can sort comments by their likelihood of rejection and check that the algorithm has properly labeled comments that fall into a grayer zone (those with a 17 to 20 percent likelihood of rejection, for instance). The community desk can then set a rule allowing all comments that score between 0 and 20 percent, say, to go through automatically.
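
A minimal sketch of what a threshold rule along those lines could look like, assuming a hypothetical scoring function: the score, cutoff, and gray-zone values below are placeholders for illustration, not the Times’ or Jigsaw’s actual system.

# Hypothetical sketch of the threshold rule described above. The score is
# assumed to come from a Perspective-style model trained on tagged comments;
# the cutoff and gray-zone values are illustrative placeholders.

AUTO_APPROVE_MAX = 0.20   # "allow all comments between 0 and 20 percent"
GRAY_ZONE = (0.17, 0.20)  # band a human spot-checks for labeling mistakes

def route_comment(likelihood_of_rejection: float) -> str:
    """Decide what happens to a comment under a simple threshold rule."""
    if likelihood_of_rejection > AUTO_APPROVE_MAX:
        return "send to human moderation queue"
    if GRAY_ZONE[0] <= likelihood_of_rejection <= GRAY_ZONE[1]:
        return "approve, but sample for human review"
    return "auto-approve"

print(route_comment(0.04))  # auto-approve
print(route_comment(0.19))  # approve, but sample for human review
print(route_comment(0.65))  # send to human moderation queue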

“We’re looking at an extract of all the mistakes it’s made, evaluate what the impact of each of those moderating mistakes might be on the community and on the perceptions of our product. Then based on that, we can choose different forms of moderation for each individual section at the Times,” Etim said. Some sections could remain entirely human-moderated; others, where comments are rarely rejected, could be automated.
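
As a rough illustration of that per-section choice, here is a hypothetical sketch that picks a moderation mode from each section’s historical rejection rate; the cutoff and section names are invented, not the Times’ actual settings.

# Hypothetical sketch: choose a moderation mode per section based on its
# historical rejection rate. The cutoff and section names are illustrative.

REJECTION_RATE_CUTOFF = 0.05  # sections that rarely reject comments may be automated

def pick_moderation_modes(rejection_rates: dict) -> dict:
    """Map each section to 'automated' or 'human-moderated'."""
    return {
        section: "automated" if rate < REJECTION_RATE_CUTOFF else "human-moderated"
        for section, rate in rejection_rates.items()
    }

print(pick_moderation_modes({"Smarter Living": 0.02, "Politics": 0.18}))
# {'Smarter Living': 'automated', 'Politics': 'human-moderated'}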

Etim’s team will be working closely with Ingber’s Reader Center, “helping out in terms of staffing projects, with advice, and all kinds of things,” though the relationship and roles are not currently codified.

“It used to be when something bubbled up in the comments, maybe we’d hear repeated comments or concerns about coverage. You’d send that off to a desk editor, and they would say, ‘That’s a good point; let’s deal with this.’ But the reporter is out reporting something else, then time expires, and it passes,” Etim said. “Now it’s at the point where when things bubble up, [Ingber] can help us take care of it in the highest levels in the newsroom.”

I asked Etim why the Times hadn’t adopted any of the Coral Project’s new tools around comment moderation, given that Coral was announced years ago as a large collaborative effort between The Washington Post, the Times, and Mozilla. It’s mostly a matter of immediate priorities, according to Etim, and he can see the Times coming back to the Coral Project’s tools down the line.

“The Coral Project is just working on a different problem set at the moment — and the Coral Project was never meant to be creating the New York Times commenting system,” he said. “They are focusing on helping most publishers on the web. Our business priority was, how do we do moderation at scale? And for moderation at our kind of scale, we needed the automation.

“The Coral stuff became a bit secondary, but we’re going to circle back and look at what it has in the open source world, and looking to them as a model for how to deal with things like user reputation,” he added.

Photo of chattering teeth toy by Wendy used under a Creative Commons license.
