Looking over the scope of the Knight News Challenge, from its beginning to the winners announced this year, it’s clear data is king. From this year’s data-mining projects alone — whether inside the confines of the newsroom, posting public records in rural areas, or delivering vital information on clean water — we can safely say that the Knight Foundation sees data as a big part of the future for journalism and communities.
But Francis Irving says we’ve only scratched the surface on how data is delivered, displayed and consumed. “It’s an unexplored area,” said Irving, CEO of ScraperWiki. “We’re right at the beginning of this.”
As you may have guessed from the name, ScraperWiki is a tool to help people collect and publish data through a simple online interface that also serves as a repository for code. (Equal parts scraper and wiki.)
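The scraping half of that equation is ordinary code that pulls structured records out of web pages. As a minimal sketch of the pattern (a hypothetical table of spending records, parsed with only the Python standard library; in practice the HTML would be fetched from a live page rather than hardcoded):

```python
from html.parser import HTMLParser

# Hypothetical sample: in a real scraper this HTML would come from a
# public-records page fetched with urllib or similar.
SAMPLE_HTML = """
<table>
  <tr><td>Acme Ltd</td><td>12000</td></tr>
  <tr><td>Widget Co</td><td>8500</td></tr>
</table>
"""

class TableScraper(HTMLParser):
    """Collects the text of each <td> cell, grouped by <tr> row."""
    def __init__(self):
        super().__init__()
        self.rows = []
        self._row = None
        self._in_cell = False

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag == "td":
            self._in_cell = True

    def handle_endtag(self, tag):
        if tag == "tr" and self._row:
            self.rows.append(self._row)
            self._row = None
        elif tag == "td":
            self._in_cell = False

    def handle_data(self, data):
        if self._in_cell and self._row is not None:
            self._row.append(data.strip())

scraper = TableScraper()
scraper.feed(SAMPLE_HTML)

# Turn the raw cells into named, typed records ready to publish.
records = [{"name": name, "amount": int(amount)} for name, amount in scraper.rows]
print(records)
```

The wiki half is what makes this different from a script on someone's laptop: the code and the resulting dataset both live in public, where others can rerun and reuse them.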
As a winner of this year’s Knight News Challenge, ScraperWiki plans to use its three-year, $280,000 grant to expand both its product and its reach. With a home base in Liverpool, ScraperWiki also hopes to cross the Atlantic and replicate its work helping journalists and ordinary citizens uncover data. “We want to lower the barrier for someone to do general purpose programming,” Irving said.
Irving told me a number of reporters and programmers in the U.K. have teamed up to use ScraperWiki to find stories and give new life to old data. James Ball, an investigative reporter for The Guardian, used ScraperWiki to write a story on the influence and spending of lobbyists on members of Parliament. ScraperWiki was also used by U.K. officials to create a search site for services provided by the government.
One of the reasons for ScraperWiki’s success, Irving said, is the meetups they throw to bring journalists and programmers face to face. Part of their expansion plans under the News Challenge grant is launching similar, Hacks/Hackers-style events here in the U.S., which will also serve as an introduction to ScraperWiki. Irving said the meetups are less about serving up punch and pie than about fostering the kind of conversation that happens when people with different perspectives gather around a shared interest.
“The value is in gathering, structuring, and building things people haven’t thought of yet,” he said.
More broadly, they plan to build out a new set of features for ScraperWiki, including an embargo tool that would allow journalists to create structured datasets to be released on publication of a story; an on-demand tool that streamlines finding and releasing records; and alerts that could notify journalists of changes to databases they follow.
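The alerts idea boils down to change detection: notice when a dataset you follow no longer matches the version you last saw. A minimal sketch of one common approach (fingerprinting each snapshot with a hash; this is an illustration, not ScraperWiki's actual implementation):

```python
import hashlib

def dataset_fingerprint(rows):
    """Hash a normalized view of a dataset so any change to any row is detectable.

    Rows are sorted by their repr so that reordering alone does not
    count as a change; edits, additions, and deletions all do.
    """
    h = hashlib.sha256()
    for row in sorted(repr(r) for r in rows):
        h.update(row.encode("utf-8"))
    return h.hexdigest()

# Hypothetical snapshots of a followed database, taken a day apart.
yesterday = dataset_fingerprint([("Acme Ltd", 12000), ("Widget Co", 8500)])
today = dataset_fingerprint([("Acme Ltd", 12500), ("Widget Co", 8500)])

changed = yesterday != today
print(changed)  # a changed row yields a different fingerprint
```

A real alerting service would store the last fingerprint per dataset and email subscribers when a fresh scrape produces a different one.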
And that gets to Irving’s larger hopes for uses of data, whether in storytelling or in surfacing vital information for the public’s use. Data journalism, he said, can serve a great purpose, but it has to expand beyond simply accessing and assessing government records for stories. That’s why Irving is interested in the new generation of news apps that step outside of the article or investigative series and take a different approach to visualization and display.
Irving said they’re happy to have a partner like Knight, which has knowledge and connections in the world of journalism that will be a help when ScraperWiki comes to these shores. The key ingredient, he said, is partnering the creative expertise of programmers, who can see new angles for code, with that of journalists, who can curate what’s important to the community.
“There’s going to be lots of things happening when you combine professional journalists with computer programmers and they can supercharge each other,” he said.