Nieman Foundation at Harvard
Aug. 8, 2013, 11 a.m.

Summer Reading 2013: “The New Precision Journalism” by Philip Meyer (1991)

“There are a lot of good, complicated stories behind simple numbers. The trick is to identify the number that will tell the story and then go find it.”

Editor’s Note: The Nieman Foundation turns 75 years old this year, and our longevity has helped us to accumulate one of the most thorough collections of books about the last century of journalism. We at Nieman Lab are taking our annual late-summer break — expect limited posting between now and August 19 — but we thought we’d leave you readers with some interesting excerpts from our collection.

These books about journalism might be decades old, but in a lot of cases, they’re dealing with the same issues journalists are today: how to sustain a news organization, how to remain relevant, and how a vigorous press can help a democracy. This is Summer Reading 2013.

The New Precision Journalism by Philip Meyer

Google Books
Amazon

The idea of “precision journalism” first surfaced in 1969, when former Nieman Fellow Philip Meyer wrote his classic book on the art of melding social science practice with traditional journalism. Over the next twenty years, as technologies developed, Meyer taught journalism students how to build stories from public records. By 1991, the tools for analysis had advanced to the point that Meyer saw fit to write a new book, The New Precision Journalism.

In an attempt to be both comprehensive and pragmatic, Meyer covers the “history of journalism in the scientific tradition; elements and techniques of data analysis; the use of statistics, computers, surveys and field experiments; database applications; election surveys; and the politics of precision journalism.” Some of these techniques and technologies, new then, sound obvious today — for example, Knight Ridder’s VU/TEXT service, which allowed you to use multiple search terms to comb for useful articles in a database.

But what’s most prescient about Meyer’s manual is not the technological advances, or his admonition to journalism students to pick up these tools as quickly as possible. After two decades of thinking about social science journalism, Meyer offers advice on how to incorporate these skills into traditional reporting — advice that’s highly relevant to how we talk today about incorporating data journalism into the newsroom, and about helping everyone from developers to designers think and build as storytellers. Here’s an example:

Bill Dedman of the Atlanta Constitution won the 1989 Pulitzer Prize for investigative reporting with an overlay of census figures on race and federally mandated bank reports on home loans. The guts of his series was in a single quantitative comparison: the rate of loans was five times as high for middle-income white neighborhoods as it was for carefully matched middle-income black neighborhoods.

One number does not a story make, and Dedman backed up the finding with plenty of old-fashioned leg work. His stories provided a good mix of general data and specific examples, such as the fluent black educator who had to try three banks and endure insulting remarks about his neighborhood before getting his home improvement loan. One of the most telling illustrations was a pair of maps of the Atlanta metropolitan area. One showed the areas that were 50 percent black or more in the 1980 census. The other showed the areas where fewer than 10 percent of owner occupied homes were financed with loans from banks or savings and loan associations. The two patterns were a near perfect match.

Dedman had help. In evaluating the evidence of racial prejudice on the part of banks, he followed a methodological trail that had been established by university researchers. Dwight Morris, the assistant managing editor for special projects, supervised the computer analysis. No complicated mainframe or sophisticated statistical analysis package was needed. The job was done with Framework, an Ashton-Tate product that integrates basic word processing, database management, spreadsheet, communication, and graphics software.

There are a lot of good, complicated stories behind simple numbers. The trick is to identify the number that will tell the story and then go find it. The new tools for manipulating data in public records should make it easier for journalists to find and reveal such light-giving numbers.
