
#NiemanLeaks big takeaway? Even post-WikiLeaks, context still key

The Nieman Foundation’s Secrecy and Journalism conference last week set out to tackle a lot of questions, but perhaps none were as big as the central one posed to attendees: What should journalism’s role be in this new environment of distributed leakers, massive databases, and citizen reporters?

The answer most of the panels seemed to reach, however, might be a comforting one: Provide the context and texture behind the data, while vetting sources for accuracy and agenda. Not too different from what journalists have always been supposed to do — but now the tools, sources, and audience have come together to allow for a much richer, deeper form of reporting than has ever been possible.

We’ve summed up and posted video and liveblogs from each of the conference sessions. But after sifting through it all, here are my five key takeaways from the discussion.

Data needs context

While Julian Assange initially relied on radical transparency as a tool to spur change, he quickly learned that crafting a narrative around the raw documents produced a much more dramatic result. Even The New York Times’ Bill Keller acknowledged WikiLeaks has “evolved.” The new leak revolution is beginning to look more and more like the old guard, even as it collaborates with them.

Beware secrecy’s hard-liners

The U.S.’s classification system may or may not be broken, as CJR’s Clint Hendler suggested in one panel — but it definitely has quirks, shortcomings, and fallibilities. As Danielle Brian, executive director of the Project on Government Oversight, put it: “It’s important not to take too seriously what the government says is and isn’t classified. It’s a game.”

Vet, vet, vet

Whether dealing with Deep Throat, a whistleblower, or a shadowy international band of hackers, journalists need to look at their sources critically, questioning the source’s agenda as well as ensuring the material is authentic. As Keller noted, The New York Times has treated WikiLeaks as a source, not a partner. The form of the source may have changed, but the fundamental relationship hasn’t. And as an added warning, note Walter Pincus’ admonition that almost all of the “new” sources that approach him are simply wrong.

WikiLeaks hasn’t (yet) established a new order

With technology — particularly technology under siege — distributed tends to win over centralized, and new organizations are already popping up all over hoping to take up WikiLeaks’ mantle. The more fundamental point, however, is that similar leaks have been driving much of journalism in the United States and around the world for decades — meaning WikiLeaks may be less new and different than it is familiar to any good investigative journalist.

The hard work is just beginning

Despite all the opportunities and changes occurring, the basic grunt work of investigative journalism is still boring, tedious, and, particularly at the local level, critical to serving as an effective watchdog for democracy.

POSTED     Dec. 21, 2010, 2 p.m.
PART OF A SERIES     Secrecy and Journalism