Hopefully you know about PastPages, the tool built by L.A. Times data journalist Ben Welsh to record what some of the web’s most important news sites have on their homepage — hour by hour, every single day. Want to see what The Guardian’s homepage looked like Tuesday night? Here you go. Want to see how that Ebola patient first appeared on DallasNews.com in September? Try the small item here. It’s a valuable service, particularly for future researchers who will want to study how stories moved through new media. (For print media, we have physical archives; for digital news, work even a few years old has an alarming tendency to disappear.)
Anyway, Ben is back with a new tool called StoryTracker, “a set of open source tools for archiving and analyzing news homepages,” backed in part by the Reynolds Journalism Institute at Mizzou.
It offers a menu of options, documented here, for creating an orderly archive of HTML snapshots, extracting hyperlinks with a bonus set of metadata that captures each link’s prominence on the page, and visualizing a page’s layout with animations that show changes over time.
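StoryTracker does the heavy lifting itself, but the core link-extraction idea is easy to sketch. Here’s a toy version in plain Python — none of these names come from StoryTracker’s API; it just illustrates pulling hyperlinks out of a homepage snapshot and recording document order as a crude stand-in for prominence:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects (href, text, position) for every <a> tag, in page order."""

    def __init__(self):
        super().__init__()
        self.links = []           # dicts in document order
        self._current_href = None
        self._current_text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self._current_href = href
                self._current_text = []

    def handle_data(self, data):
        if self._current_href is not None:
            self._current_text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._current_href is not None:
            self.links.append({
                "href": self._current_href,
                "text": "".join(self._current_text).strip(),
                # earlier in the HTML ≈ higher on the page, very roughly
                "position": len(self.links),
            })
            self._current_href = None

def extract_links(html):
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

# A made-up homepage fragment, standing in for an archived snapshot.
snapshot = """
<html><body>
  <h1><a href="/top-story">Top story</a></h1>
  <ul><a href="/second">Second item</a></ul>
</body></html>
"""
links = extract_links(snapshot)
print(links[0]["href"])   # → /top-story
```

StoryTracker’s real extraction is far richer — it records where each link actually renders on the page, not just its order in the markup — but the output shape (a list of links plus metadata) is the same idea.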
The potential uses for researchers are obvious, but I could also imagine plenty of real-time uses. By tracking your own homepage over time, you could get good data on how the granular movement of stories there correlates with traffic. (To ask questions like: Is the top slot more or less valuable on weekends or overnight than during the day Monday to Friday?) You could track your competitors’ homepages to get hard data on which stories they’re pushing hardest. And unlike the base PastPages, which saves screenshots of homepages, StoryTracker gets at the HTML to determine which stories are where. It’s all open source, so have at it. (Here’s a sample analysis of which sources the Drudge Report links to most.)
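That Drudge-style analysis boils down to counting link domains across snapshots. A minimal sketch of the idea using only the Python standard library — the hrefs below are made-up placeholders, not real StoryTracker output:

```python
from collections import Counter
from urllib.parse import urlparse

# Hypothetical hrefs pulled from archived homepage snapshots; in practice
# these would come from StoryTracker's hyperlink extraction.
hrefs = [
    "http://www.dailymail.co.uk/news/article-1.html",
    "http://www.reuters.com/article/story-2",
    "http://www.dailymail.co.uk/news/article-3.html",
]

# Tally links by domain to see which sources a homepage favors.
domains = Counter(urlparse(h).netloc for h in hrefs)
for domain, count in domains.most_common():
    print(domain, count)
# → www.dailymail.co.uk 2
# → www.reuters.com 1
```

Run the same tally over months of hourly snapshots and you have the skeleton of the linked Drudge analysis.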
Ben presented StoryTracker at a conference at RJI earlier this week; here’s the video and his slide deck.