Nieman Foundation at Harvard
June 27, 2011, 1:30 p.m.

Deeper into data: U.K.-based ScraperWiki plans new tools and U.S. expansion with News Challenge

Looking over the scope of the Knight News Challenge, from its beginning to the winners announced this year, it’s clear data is king. From this year’s data-mining projects alone — whether inside the confines of the newsroom, posting public records in rural areas, or delivering vital information on clean water — we can safely say that the Knight Foundation sees data as a big part of the future for journalism and communities.

But Francis Irving says we’ve only scratched the surface on how data is delivered, displayed and consumed. “It’s an unexplored area,” said Irving, CEO of ScraperWiki. “We’re right at the beginning of this.”

As you may have guessed from the name, ScraperWiki is a tool that helps people collect and publish data through a simple online interface that also serves as a repository for code. (Equal parts scraper and wiki.)
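The core scraping idea — fetch a page, extract structure, save rows to a database — can be pictured with a minimal Python sketch. This uses only the standard library and a made-up target table; it is an illustration of the pattern, not ScraperWiki's actual API:

```python
import sqlite3
from html.parser import HTMLParser


class RowScraper(HTMLParser):
    """Collects the text of every <td> cell, grouped by table row."""

    def __init__(self):
        super().__init__()
        self.rows = []        # finished rows, as tuples
        self._row = []        # cells of the row being parsed
        self._in_cell = False

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag == "td":
            self._in_cell = True

    def handle_endtag(self, tag):
        if tag == "td":
            self._in_cell = False
        elif tag == "tr" and self._row:
            self.rows.append(tuple(self._row))

    def handle_data(self, data):
        if self._in_cell:
            self._row.append(data.strip())


def scrape(html):
    """Turn raw HTML containing a two-column table into a list of rows."""
    parser = RowScraper()
    parser.feed(html)
    return parser.rows


def save(rows, db_path="data.db"):
    """Persist scraped rows so others can query them later."""
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS records (name TEXT, value TEXT)")
    con.executemany("INSERT INTO records VALUES (?, ?)", rows)
    con.commit()
    con.close()
```

In practice the HTML would come from a live URL and the schema would match the real records, but the shape — parse, structure, store — is what a scraper on the site boils down to.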

As a winner of this year’s Knight News Challenge, ScraperWiki plans to use their three-year, $280,000 grant to expand both their product and their reach. With a home base in Liverpool, ScraperWiki also hopes to cross the Atlantic and replicate their work helping journalists and ordinary citizens uncover data. “We want to lower the barrier for someone to do general purpose programming,” Irving said.

Irving told me a number of reporters and programmers in the U.K. have teamed up to use ScraperWiki to find stories and give new life to old data. James Ball, an investigative reporter for The Guardian, used ScraperWiki to write a story on the influence and spending of lobbyists on members of Parliament. ScraperWiki was also used by U.K. officials to create a search site for services provided by the government.

One of the reasons for ScraperWiki’s success, Irving said, is the meetups they throw to bring journalists and programmers face to face. Part of their expansion plans under the News Challenge grant is launching similar, Hacks/Hackers-style events here in the U.S., which will also serve as an introduction to ScraperWiki. Irving said the meetups are less about serving up punch and pie than about fostering the kind of conversation that happens when you bring different perspectives together around a shared interest.

“The value is in gathering, structuring, and building things people haven’t thought of yet,” he said.

More broadly, they plan to build out a new set of features for ScraperWiki: an embargo tool that would let journalists create structured datasets released only when a story is published; an on-demand tool that streamlines the process of finding and releasing records; and alerts that notify journalists of changes to databases they follow.
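Of the three, the alerting idea is the easiest to picture: poll a dataset, fingerprint it, and fire a notification when the fingerprint changes. A rough Python sketch of that pattern follows; the function names and hash-based approach are illustrative assumptions on my part, not ScraperWiki's actual design:

```python
import hashlib
import json


def fingerprint(dataset):
    """Stable hash of a dataset (a list of dict rows).

    Serializing with sorted keys makes the hash independent of
    dict ordering, so only real data changes alter the fingerprint.
    """
    canonical = json.dumps(dataset, sort_keys=True)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()


def check_for_changes(dataset, last_seen):
    """Return (changed, new_fingerprint).

    A caller would store the fingerprint between polls and alert
    subscribing journalists whenever `changed` comes back True.
    """
    current = fingerprint(dataset)
    return current != last_seen, current
```

A real alerting service would layer scheduling and delivery (email, webhooks) on top, but change detection against a stored fingerprint is the kernel of the feature.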

And that gets to Irving’s larger hopes for uses of data, either in storytelling or surfacing vital information for the public’s use. Data journalism, he said, can serve a great purpose, but has to expand beyond simply accessing and assessing government records for stories. That’s why Irving is interested in the new generation of news apps that step outside of the article or investigative series, that take a different approach to visualization and display.

Irving said they’re happy to have a partner like Knight, which has knowledge and connections in the world of journalism that will be a help when ScraperWiki comes to these shores. The key ingredient, he said, is pairing the creative expertise of programmers, who can see new angles for code, with journalists, who can curate what’s important to the community.

“There’s going to be lots of things happening when you combine professional journalists with computer programmers and they can supercharge each other,” he said.

PART OF A SERIES     Knight News Challenge 2011