Editor’s Note: The Nieman Foundation turns 75 years old this year, and our longevity has helped us to accumulate one of the most thorough collections of books about the last century of journalism. We at Nieman Lab are taking our annual late-summer break — expect limited posting between now and August 19 — but we thought we’d leave you readers with some interesting excerpts from our collection.
These books about journalism might be decades old, but in a lot of cases, they’re dealing with the same issues journalists are today: how to sustain a news organization, how to remain relevant, and how a vigorous press can help a democracy. This is Summer Reading 2013.
In an attempt to be both comprehensive and pragmatic, Meyer covers the “history of journalism in the scientific tradition; elements and techniques of data analysis; the use of statistics, computers, surveys and field experiments; database applications; election surveys; and the politics of precision journalism.” Some of these techniques and technologies, new then, sound obvious today. One example: Knight Ridder’s VU/TEXT service, which let reporters combine multiple search terms to comb a database for useful articles.
But what’s most prescient about Meyer’s manual is not the technological advances, or his admonition to journalism students to pick up these tools as quickly as possible. After two decades of thinking about social science journalism, Meyer offers advice on how to fold these skills into traditional reporting, advice that maps closely onto how we talk today about bringing data journalism into the newsroom and helping everyone from developers to designers think and build as storytellers. Here’s an example:
Bill Dedman of the Atlanta Constitution won the 1989 Pulitzer Prize for investigative reporting with an overlay of census figures on race and federally mandated bank reports on home loans. The guts of his series were in a single quantitative comparison: the rate of loans was five times as high for middle-income white neighborhoods as it was for carefully matched middle-income black neighborhoods.
One number does not a story make, and Dedman backed up the finding with plenty of old-fashioned legwork. His stories provided a good mix of general data and specific examples, such as the affluent black educator who had to try three banks and endure insulting remarks about his neighborhood before getting his home improvement loan. One of the most telling illustrations was a pair of maps of the Atlanta metropolitan area. One showed the areas that were 50 percent black or more in the 1980 census. The other showed the areas where fewer than 10 percent of owner-occupied homes were financed with loans from banks or savings and loan associations. The two patterns were a near perfect match.
Dedman had help. In evaluating the evidence of racial prejudice on the part of banks, he followed a methodological trail that had been established by university researchers. Dwight Morris, the assistant managing editor for special projects, supervised the computer analysis. No complicated mainframe or sophisticated statistical analysis package was needed. The job was done with Framework, an Ashton-Tate product that integrates basic word processing, database management, spreadsheet, communication, and graphics software.
There are a lot of good, complicated stories behind simple numbers. The trick is to identify the number that will tell the story and then go find it. The new tools for manipulating data in public records should make it easier for journalists to find and reveal such light-giving numbers.
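The "single quantitative comparison" at the heart of Dedman's series is simple arithmetic once census figures and loan records are joined by neighborhood. As a minimal sketch of that kind of calculation, here is a Python version; the tract names and all figures below are invented for illustration, not drawn from the actual series:

```python
# Sketch of the core comparison in a lending-disparity analysis:
# loan rates in matched middle-income white vs. black neighborhoods.
# Every number here is hypothetical.

# (tract, majority_race, owner_occupied_homes, bank_loans)
tracts = [
    ("A", "white", 1200, 300),
    ("B", "white",  900, 240),
    ("C", "black", 1100,  55),
    ("D", "black", 1000,  52),
]

def loan_rate(rows, race):
    """Loans per owner-occupied home across all tracts of one group."""
    homes = sum(r[2] for r in rows if r[1] == race)
    loans = sum(r[3] for r in rows if r[1] == race)
    return loans / homes

white_rate = loan_rate(tracts, "white")
black_rate = loan_rate(tracts, "black")
ratio = white_rate / black_rate

print(f"white rate: {white_rate:.3f} loans per home")
print(f"black rate: {black_rate:.3f} loans per home")
print(f"ratio: {ratio:.1f}x")
```

With these made-up figures the ratio comes out near five to one, the shape of the disparity the series reported. The hard journalistic work, as Meyer stresses, is not the division but matching the neighborhoods fairly and backing the number with reporting.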