To mark the 100th anniversary of the start of World War I, The Guardian today launched a massive 32-minute interactive documentary that illustrates the history of the war.
The Guardian’s aim with the interactive is to tell the story of the war from a global perspective. To achieve that, the documentary includes 10 historians from all around the world discussing the war and its impact from their country’s perspective. In a blog post on The Guardian’s website, special projects editor Francesca Panetta explained how her team set out to get a global perspective on World War I:
I also sourced letters, diary entries and poems from around the world. We’re familiar here in the UK with war poets Wilfred Owen and Siegfried Sassoon, but I thought there must be Turkish and Indian equivalents. With the help of Modern Poetry in Translation we found poems from all over the world. Of course, cutting all this together is itself editorialising. But as much as possible we wanted this global story to be told through the words of participators rather than a posh, English, scripted voiceover!
To further that global approach, The Guardian, through a partnership with the British Academy, translated the interactive into six languages in addition to English — French, German, Italian, Spanish, Arabic, and Hindi. (The videos are all narrated in English, but have different subtitles depending on what language the user selects.) Since 2011, the British Academy, which supports the humanities and social sciences in the U.K., has funded a program with The Guardian to advocate for improved language education. That program, the Case for Language Learning, paid for professional translations.
Kiln, the London-based interactive design firm that built the interactive, “designed a clever mechanism that displayed the subtitles and an international team of Guardian journalists checked them for accuracy,” Panetta wrote in her blog post. “We had all languages skills in-house, except Hindi!” The Guardian is looking to translate the interactive into even more languages by asking readers if they’d like to offer their own translation services for additional languages. Other news organizations have also offered to help with translations, Panetta said.
“A Scandinavian paper has already approached us saying they may be interested in collaborating in this way,” she told me in an email. “It depends on the response we get, but we would very much like to relaunch in the autumn with more languages. That’s the plan.”
Here’s an interesting project from the data-oriented software developer Shoothill: GaugeMap is an interactive map with live river-level data from over 2,400 government gauges across England and Wales. From the announcement:
GaugeMap aims to help to look after and improve the natural environment by allowing these users to access this data on the move, wherever they are. Users can retrieve live data on actual river levels via the website, or by following the new, dedicated Twitter accounts that GaugeMap has established for each of the Environment Agency’s 2,400+ river level monitoring stations they may be interested in. For example Teddington Lock now has its own Twitter account: https://twitter.com/riverlevel_1182.
“GaugeMap will help any river user to be better informed, whether they use the river for recreation, pleasure or business,” said Rod Plummer, MD at Shoothill. “It also provides accurate, up-to-date information to help with water abstraction and so it could potentially be used to ensure the amount of water being abstracted from any river at any given time is sustainable and acceptable. Over-abstraction of river systems can cause changes in water quality, which obviously can have wide-reaching impacts on the wildlife that relies on our natural waterways, both directly and indirectly.”
It’s the Twitter integration that most interests me — over 2,400 accounts, each tied to a specific spot on a specific river, sending out alerts about water levels:
One could imagine ways to improve the bots. For instance, the accounts don’t seem to be smart enough to automatically alert when the water gets dangerously high. The GaugeMap site tells me that Catcliffe Drain is in a “Flooding Possible” state, but you couldn’t tell that from Catcliffe Drain’s Twitter account.
Still, the idea is powerful: a kind of distributed EveryBlock. One could imagine a local news organization gathering together data like this and pushing it out through neighborhood focused social media accounts, automatically and without human intervention.
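The smarter alerting that seems to be missing could be a simple threshold check layered on top of the gauge feed. A minimal sketch of that idea in Python — the station name is from GaugeMap, but the threshold logic, function names, margin, and readings are all invented for illustration:

```python
# Hypothetical sketch: deciding when a river-gauge bot should post a flood
# alert. The threshold rule and all numbers here are assumptions, not
# GaugeMap's actual logic.

def should_alert(level_m: float, typical_max_m: float, margin: float = 0.1) -> bool:
    """Alert once the level exceeds the station's typical range by a margin."""
    return level_m > typical_max_m * (1 + margin)

def format_alert(station: str, level_m: float) -> str:
    return f"{station}: level {level_m:.2f} m is above the typical range. Flooding possible."

# Example: a station whose typical maximum is (hypothetically) 1.20 m
reading = 1.45
if should_alert(reading, typical_max_m=1.20):
    print(format_alert("Catcliffe Drain", reading))
```

A bot like this could post only when the state changes (normal to high, high to normal), which keeps the account quiet most of the time and makes each tweet meaningful.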
The surprising thing to me – and, I believe, an unusual one among newspaper paywalls – is that the N&R is charging more for a digital subscription than for a print subscription.
Currently, a 7-day, 52-week subscription costs $187.12. According to an ad in the newspaper today, the digital subscription is $215.40. (FYI, the subscription page on the website hasn’t been updated, at least that I can find.)
In comparison, the News & Observer charges $390 for a year’s print subscription, and only $69.95 for a digital subscription. The Star News in Wilmington charges $218.40 for the print edition, and $131.40 for a digital subscription.
But the N&R is cutting the other way. Editor/Publisher Jeff Gauger explains: “The reason for that variance? A print subscription permits us to subsidize the cost of content by providing access to your home or business for preprinted advertising circulars. A digital-only subscription lacks that advertising subsidy.”
The N&R’s move is unusual, but it’s far from unprecedented. The Orange County Register is currently offering digital access for $3.99 a week, or digital plus Sunday print for $2.99 a week. That’s right: They’ll essentially pay you a dollar a week to take the Sunday paper. The New York Times has done something similar since launching its paywall — Sunday print gives you all-digital access at a price that’s usually cheaper than all-digital access itself. (Currently: $8.60 a week for Sunday print plus digital, $8.75 a week for just digital.)
What is a bit unusual here is pricing digital above a seven-day print subscription, not just a Sunday print subscription. But Berkshire Hathaway-owned papers have gone against the grain before. The Omaha World-Herald charges even 7-day print subscribers for digital access ($7 extra a month!) and $25 a month for digital alone. The Tulsa World offers $14.99 a month for digital — or $14 a month for digital plus Sunday and Wednesday print. It’s a weird world.
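The bundling math behind these offers is easy to make explicit. A quick calculation using only the figures quoted above:

```python
# Weekly prices quoted in this piece (USD). Subtracting shows that adding
# Sunday print *lowers* the weekly price at both papers.
ocr_digital, ocr_digital_plus_sunday = 3.99, 2.99   # Orange County Register
nyt_digital, nyt_digital_plus_sunday = 8.75, 8.60   # New York Times

ocr_sunday_discount = round(ocr_digital - ocr_digital_plus_sunday, 2)  # 1.00
nyt_sunday_discount = round(nyt_digital - nyt_digital_plus_sunday, 2)  # 0.15

# The N&R flips the sign: annual digital costs more than seven-day print.
nr_print, nr_digital = 187.12, 215.40
nr_digital_premium = round(nr_digital - nr_print, 2)  # 28.28

print(ocr_sunday_discount, nyt_sunday_discount, nr_digital_premium)
```

That $28.28 annual digital premium is exactly the “advertising subsidy” Gauger describes: the preprint circulars in a print paper offset part of the content cost, and a digital-only subscriber pays it instead.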
Is the pricing structured to encourage digital users to subscribe to the paper? After all, the more subscribers a paper has, the more it can charge advertisers. (Despite what many readers think, advertising, not subscription fees, pays the bulk of the cost of a newspaper.) I doubt this is the actual intent, but it does make some perverse sense to the consumer, unless you can get what you need from the website’s 20 free articles per month.
I can’t speak for the News & Record, but at many papers, that is exactly the intent. Propping up print numbers isn’t the only reason to structure offers this way, but it’s a big one.
Two of the biggest trends in news today: the rise of mobile and the rise of data visualization.
The unfortunate reality is that they’re often in conflict. Too many beautiful data visualizations are designed with a big desktop browser window in mind, not the smaller screen of an iPhone or an Android phone. Text becomes unreadable, interactions become untappable, and a lovely experience becomes unusable.
Vertical Bar Charts: When using bar charts in portrait mode, stack the bars vertically so they use the screen’s height rather than its width.
Use Vertical Scrolling: When an interface doesn’t fit on the screen in its entirety, enable vertical scrolling instead of horizontal scrolling.
Stack Table Cells: When displaying tables with more than a couple of columns, consider stacking cells vertically within each row.
Carousel Instead of Tabs: When allowing users to switch between different displays, consider a carousel with next and previous buttons instead of tabs, which require a lot of horizontal space.
Fix Tooltips to Area of Screen: When displaying detail information on touch, designate an area of the screen that updates as the user taps different data points, rather than floating a tooltip under the finger.
Use Touch Zones: When displaying many small data points that are hoverable/touchable, consider defined touch zones around each point instead of requiring precise taps.
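Most of the tips above reduce to one decision: detect a small or portrait viewport and swap in a vertically oriented layout. A sketch of that decision rule — the breakpoint and the option names are illustrative assumptions, not from any published guideline:

```python
# Illustrative sketch: pick a mobile-friendly chart configuration from the
# viewport size. The 600px breakpoint and the option names are assumptions.

def chart_config(width_px: int, height_px: int) -> dict:
    portrait = height_px >= width_px
    narrow = width_px < 600  # typical phone breakpoint (assumption)
    return {
        "bar_layout": "stacked-vertically" if portrait else "side-by-side",
        "scroll": "vertical" if narrow else "none",
        "view_switcher": "carousel" if narrow else "tabs",
        "tooltip": "fixed-panel" if narrow else "floating",
    }

print(chart_config(375, 667))   # phone-sized portrait viewport
print(chart_config(1280, 800))  # desktop viewport
```

In a real visualization this check would run on load and on every resize or orientation change, so the same data gets a different presentation rather than a shrunken one.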
A developer from Grist wants to build tools to measure audience in ways outside of pageviews or clicks. Another project aims to create a better tracking system for court records in Massachusetts through a public database. Talkbox, a project from New York Public Radio, would repurpose old phone booths to create a two-way line between the community and the newsroom to help with reader engagement and reporting.
All Prototype Fund grantees receive $35,000 to fund their ideas in the early stages. Each project goes through a prototyping workshop and instruction on human-centered design with the LUMA Institute. After that, teams have six months to work on their project before a demo day. (Obligatory disclaimer: Knight is a funder of Nieman Lab, though not through the Prototype Fund.)
Here’s the full list of 16 projects:
DIY StoryCorps by StoryCorps (Project lead: Dean Haddock): Advancing the mission of StoryCorps, a national program that records, preserves and shares people’s stories, by developing a mobile app that allows anyone to create do-it-yourself interviews.
Do Public Good Button by Public Good Software (Project lead: Dan Ratner): Developing a tool that allows people to take action on important issues through news articles; for example, someone reading an article about drunk driving could click a button and connect with related charities and advocacy groups.
Engagement tools by Grist (Project lead: Chip Geller): Allowing newsrooms to better measure audience engagement, beyond clicks and page views, by creating an open-source WordPress plugin that will measure “attention minutes” to determine how long users are interacting with content.
Facto_Bot (Project lead: Will Knight): Helping prevent misinformation on Twitter by developing software that identifies stories that have been modified, and alerts people who tweeted or retweeted links to these stories that content has changed.
FilmSync App by University of North Carolina (Project lead: Steven King): Creating an app that will connect people who are watching a news story or documentary on television with related content through a second screen app on their smartphones.
Global I-Hub by ICIJ (Project lead: Mar Cabra): Making collaboration on cross-border investigative stories easier by providing a secure, easy-to-use platform for reporters to communicate through Facebook-like status updates, threaded communications on specific topics, individual messaging and file sharing.
Market Atlas (Project lead: Jon Gosier): Scaling a data provider network that allows citizens to collect and share microeconomic data from countries in Africa that lack financial infrastructure; providing reliable, consistent financial data should encourage greater investment in the area.
OpenStreetMap Plugin for Open Data Kit by Humanitarian OpenStreetMap Team (Project lead: Kate Chapman): Allowing easier collection of open geographic data, even in places with connectivity issues, by combining Open Data Kit’s data collection with OpenStreetMap’s data community.
PatientsAssemble by PatientsLikeMe (Project lead: Chris Fidyk): Helping people with chronic illnesses interact with policymakers through open-source collaborative tools that will allow users to provide feedback and shape issues that are important to them.
Pilot for School by The Virginian-Pilot (Project lead: Shawn Day): Building a targeted digital system that will allow Virginia teachers to search newspaper content and use it to complement class curriculums; content will align with Virginia’s Standards of Learning and help students apply academic concepts to what’s happening in their community.
Public Database of Massachusetts Court Records by MassINC (Project lead: Steve Koczela): Allowing journalists and the public to better monitor court cases through an online filing and database system for Massachusetts court records.
Public Record (Advanced Emergency Radio Scanner and Repository) (Project lead: Tal Achituv): Creating a tool that will allow journalists to better track current events and investigate past events; with the tool newsrooms can record interactions on police/emergency radios, set alerts and listen to archived content.
QC Tools by Bay Area Video Coalition (Project lead: Carol Varney): Allowing media organizations, journalists and others to easily preserve analog video through an open-source video digitization app that is inexpensive and easy to use.
Talkbox by New York Public Radio (Project lead: Caitlan Thompson): Involving the community in news stories by repurposing phone booths in specific neighborhoods that will provide residents with a direct, two-way line to the New York Public Radio newsroom; a “Talkbox” can help with engaging new audiences or to get information in a neighborhood where a reporting project is taking place.
The Last Graph (Project Lead: Ben Conners): Helping journalists engage with audiences by allowing readers to interact with the final paragraph of a story through a database of “actions” that lead to reader involvement on an issue; for example, the last graph of an article on air pollution could include an action that encourages readers to sign a pledge to use public transit more.
Veritza (Project lead: Djordje Padejski): Helping reporters more easily find story leads from public records through a web platform that allows users to create alerts on information in these records; the platform will do this by scraping and aggregating data and analyzing it for patterns and anomalies.
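Of these, Grist’s “attention minutes” metric is the most concrete to sketch: count only the time a user is actively interacting, and stop the clock when the gap between interaction events gets too long. A minimal version of that calculation in Python — the 5-second idle cutoff is an assumption for illustration, not Grist’s actual parameter:

```python
# Sketch of an "attention minutes" style calculation: sum the time between
# interaction events (scrolls, clicks, key presses), but stop counting
# whenever the gap between events exceeds an idle cutoff. The cutoff value
# is an assumption, not from the actual plugin.

def attention_seconds(event_times: list[float], idle_cutoff: float = 5.0) -> float:
    total = 0.0
    for prev, curr in zip(event_times, event_times[1:]):
        gap = curr - prev
        if gap <= idle_cutoff:  # user still engaged; count the whole gap
            total += gap
    return total

# Events at 0s, 2s, 4s, then a 60s pause, then 65s and 66s:
events = [0.0, 2.0, 4.0, 64.0, 65.0, 66.0]
print(attention_seconds(events))  # 2 + 2 + 1 + 1 = 6.0 seconds
```

The point of the metric is visible in the example: a tab left open for a minute contributes nothing, so the number reflects engagement rather than mere presence.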
Just five months after opening up its Data Store — which sells some of the big datasets its reporters produce for stories and projects — ProPublica says it’s generated “well over” $30,000 in new revenue. That figure comes from ProPublica president Richard Tofel in an interview with Southern Methodist University journalism professor Jake Batsell. Since they opened up shop in February, Tofel says more than 500 data sets have been downloaded.
“The 500 downloads, that’s probably more important from a mission standpoint,” Tofel told me. But those who have paid for the premium sets — so far, mostly companies and consultants from the medical industry — may well become repeat customers down the road, when it’s time to update the data. “I think we would consider it a successful experiment in that sense,” he said.
The Data Store serves several purposes for ProPublica, as a means of providing new life to the information they’ve collected and as a potential new source of revenue. That $30,000 is a drop in the bucket compared to the annual grants and contributions ProPublica receives each year. But it’s $30,000 ProPublica didn’t have before, and, as Tofel told me back in February, the broader goal of the store is making more data publicly available.
Good on New York for finding a new audience for an old piece. But I also want to highlight something useful that the BBC is doing to counteract the impression that years-later bursts of social activity can give — that an old news event is happening right now. As spotted by Meg Pickard, formerly of The Guardian (and a few days earlier by this guy):
Not sure how long this notification's been on BBCNews. Should reduce confusion when old stories surface. Context=good pic.twitter.com/o3KDS0db8e
Leaving New York is eternal, and I imagine Chastain’s story will continue to be told for a long time too. In these cases, their newness or oldness isn’t crucial. But that’s not true of all news stories. I’m glad the BBC now has a system for highlighting when a popular news story isn’t necessarily a new one.
In case you spent Friday under a rock, and that rock had no wifi and, like, only one bar of signal, all-world basketball player LeBron James announced he would be returning to his old team, the Cleveland Cavaliers. Breaking the story was Sports Illustrated, which published a first-person essay by James “as told to” SI staffer Lee Jenkins. It was quite a get for SI; at the moment, the story’s been tweeted over 140,000 times.
But Richard Sandomir of The New York Times didn’t care for it much, writing a story ominously headlined “Getting the Scoop, but Not Necessarily the Story; Role of Sports Illustrated in LeBron James’s Announcement Raises Journalistic Questions.”
…armed with the biggest news of the day, the magazine presented it as a 952-word statement on its website from the King, not a full-blown news story with context and breadth…
News value aside, the approach cast Sports Illustrated more as a public-relations ally of James than as the strong journalistic standard-bearer it has been for decades.
And while James’s words may have been all that the sports world wanted to hear, the magazine should have pressed for a story that carried more journalistic heft.
This is crazy. It’s an instance where Sandomir and the Times — who I think are fantastic most of the time, by the way — are fetishizing the business of Serious Journalism at the expense of understanding what sports fans actually care about and how informed they already are, and asserting that the reporter’s highest and best function is to get between fans and the news rather than to deliver it to them.
Question: what, apart from the name of the team LeBron James chose and his reason for choosing it, did people interested in this story not already know or actually care about? What sort of “journalistic heft” does Sandomir think should have been added to “serve the reader” better? Jenkins prefacing the actual news with “James, 29, from Akron, has played for Miami since the 2010-11 season,” would not have added journalistic integrity here. It would have been byline-justifying filler.
Everyone tuning in to this story knows what’s happening. Sports Illustrated and Jenkins provided them with the one thing they didn’t know: where James was going and why. If there is any concern about larger context here, it can and will be addressed by SI sidebars, bullet-pointed, fact-based graphics and, most importantly, an in-depth story from Jenkins about his conversations with James which provides deeper context. All of which, I assume, have either already been published or will soon be.
[James' announcement is] a big event, sure, but at bottom it’s functionally equivalent to a team issuing a statement that it placed a player on the disabled list. That day’s starting lineup. A simple bit of data. A commodity. And just as sports teams and leagues are increasingly bypassing the press in order to release that sort of commodity news directly to fans via their Twitter feeds or in-house news operations, LeBron James could have very easily tweeted that he was heading back to Cleveland to his 13.6 million followers. Or, like he did back in 2010, could’ve said it on some TV show cum P.R. festival he created for himself. Indeed, it’s amazing to me that Sports Illustrated even got what it got here and they should be credited for getting that much. I didn’t need more than that yesterday. I’m more than happy — hell, very, very eager — to wait for Jenkins’ in-depth followup to all of this. I bet it’ll be incredible.
Photo of LeBron James in 2009 by Keith Allison used under a Creative Commons license.
In late June, Damman tweeted that he had received a takedown notice from Twitter, but the bot continued to send out tweets through the semifinal games earlier this week. FIFA, soccer’s governing body, and the TV networks that own the rights to the games have been vigilant about removing unofficial GIFs, videos, and images of the World Cup games.
At Recode, Peter Kafka, who first wrote about @ReplayLastGoal being removed, questioned how Twitter will handle instances like this in the future:
I do wonder how Twitter will approach this stuff for other big global sports events. Right now, the company’s approach is to leave anything and everything up until it gets DMCA takedown requests, more or less like YouTube. Unlike YouTube, however, Twitter doesn’t seem to have an expedited process available to let copyright holders pull stuff off the site.
In ReplayLastGoal’s case, for instance, it seems to have taken Twitter 11 days to take the account offline.
But Twitter is also the same company that’s basing much of its sales strategy around the idea that it’s working with TV programmers, not against them. One of its highest-profile ad products, for instance, lets programmers take sports highlight reels and turn them into ads minutes after they run on TV. That pitch may be harder to make if those highlights are already up on Twitter.
In yesterday’s paper, The Dallas Morning News announced it was ending its experiment with a “premium” site. We wrote about it back in October, when it launched. (The premium strategy replaced a more traditional paywall, albeit one that had hard categories of free vs. paid stories, not the metered approach most American dailies have taken. And to get my disclosure out of the way, I worked at the Morning News for eight years and root for it still.)
The idea was that, rather than shut a lot of good content off from the free web, maybe you could increase digital revenue by creating a “premium” experience — a nicer look, getting rid of ads — and charging people 12 bucks a month to access it. As the DMN story notes, the premium experience was also “launched with promises of personalization and loyalty programs to come later,” which never really materialized.
I appreciate the Morning News’ willingness to stray from the newspaper norm in seeking revenue. It was an early leader in wringing more revenue out of its most loyal print subscribers; it’s tried out multiple approaches to a free targeted daily product; that paywall strategy went against the grain. But you could see this result coming a Texas mile away. The premium site was not some beautiful, immersive experience — it was aggressively ugly and a pain to navigate. I found it actively worse than the non-premium site, and far from a good enough offering to drive payment. From last fall:
All Dallas Morning News articles: free! All articles laid out onto rectangles with photo backgrounds: $143/year http://t.co/U5ZZWk3c4e
• It’s hard to know what lessons were learned by The News because so much of what went wrong here was a result of disorganization instead of strategy. The central question the premium site tried to answer — would people pay money for a better web experience (what they internally called a “velvet-rope experience”) — was never answered because that experience never materialized. This was partly due to the suicidal timeline the project employed (which caused all other digital projects, current and future, to be neglected) but also because some elements were never rolled out. The experiment was supposed to have three components (what Dyer would often call “three legs of the stool”): 1) a better-looking site; 2) one with little to no ads; 3) one that offered significant subscriber perks. The third part — which was Dyer’s responsibility — never really happened. [I'd argue the first never happened either. Dyer here is chief marketing officer Jason Dyer. —Josh] They imagined offering Christmas card photos taken for you by Pulitzer-winning photogs, or game-watching parties with beat writers. They ended up offering T-shirts. That was part of the problem. The other:
• The marketing/sales folks who were effing this cat never got newsroom buy-in. Top newsroom folks were against the premium site from Day 1. Once the premium site went live and started siphoning traffic (not much, but some) from the basic site, the newsroom freaked. Understandable, since you were diluting the newsroom’s only real measure of success. And even if you think big gray corporate newsrooms need disruption, you’re not going to convince them when your efforts fail spectacularly. The number of non-subscribers who actually came to the premium site, looked around, and said, “I’ll pay for this” was “a fingers-and-toes” number, I was told today.
• The News is not thinking right now about how to squeeze more money out of subscribers. It’s just trying to find a way to reach a mobile audience so it can THEN figure out how to monetize it. The mobile efforts to which Dyer refers are just a mobile version of the premium site — I know, I know, at least this time everyone will get it for free. But there is a comprehensive, integrated (advertising/newsroom/marketing/subscription) strategy being put in place for a mobile-first platform that should start rolling out this fall and continue for a few years. It’s another valiant effort by the DMN to be nimble, to figure this new-media landscape out before it kills them. But first …
• They have to do what Dyer wrongly says they’ve done: Take valuable lessons from their failures. The DMN learned NOTHING from this it didn’t already know. The paper learned it with its paywall, and its tablet app, and when it tried to charge for high-school scores: People won’t pay for content that is ubiquitous, and the newsroom will (perhaps rightly) sabotage any effort that doesn’t get its reporters the biggest audience possible.