LINK: www.pewinternet.org  ➚   |   Posted by: Caroline O'Donovan   |   July 3, 2014

Pew Research Center’s Internet & American Life Project is out with a report today based on a survey of leaders in information technology. Earlier, Pew asked experts what digital life would look like in 2025; today’s update focuses on potential threats to our information network.

The concerns are broken down into four categories:

1) Actions by nation-states to maintain security and political control will lead to more blocking, filtering, segmentation, and balkanization of the Internet.

2) Trust will evaporate in the wake of revelations about government and corporate surveillance and likely greater surveillance in the future.

3) Commercial pressures affecting everything from Internet architecture to the flow of information will endanger the open structure of online life.

4) Efforts to fix the TMI (too much information) problem might over-compensate and actually thwart content sharing.


The first two categories have broad implications for journalists — for example, their safety, and how they practice their craft. Journalists will have to push back against creeping government and commercial control while also learning, like everyone else, how to resist surveillance.

The latter two categories, however, speak more directly and immediately to the world of digital publishing. For example, as those commercial pressures mount, it will become increasingly important to make sure that news and information providers understand and have access to the heavy-duty tools of the Internet, so that rapidly consolidating power and money do not overwhelm them:

Glenn Edens, director of research in networking, security, and distributed systems at PARC, said, “Network operators’ desire to monetize their assets to the detriment of progress represents the biggest potential problem. Enabling content creators to easily and directly reach audiences, better search tools, better promotion mechanisms and curation tools — continuing to dismantle the ‘middle men’ is key.”

The fluidity of content was also a major concern for some respondents, who look forward to access being treated as a right and who believe that sharing is the antidote to a fractured Internet:

Clark Sept, co-founder and principal of Business Place Strategies Inc., wrote, “Online content access and sharing will be even better and easier by way of personal digital rights access. Sharing freely will be recognized as having greater long-term economic value than strictly limited controls over ‘intellectual property.’”

Or, more briefly:

Jim Harper, director of information policy studies at the Cato Institute, responded, “People are going to get what they want, and they want to share content.”

If the risk of an information ecosystem in which content is produced and controlled by a few economically powerful players isn’t clear, Doc Searls explains:

“What the carriers actually want — badly — is to move television to the Net, and to define the Net in TV terms: as a place you go to buy content, as you do today with cable. For this they’ll run two-sided markets: on the supply side doing deals with “content providers” such as Hollywood and big publishers, and on the demand side by intermediating the sale of that content. This by far is the most serious threat to sharing information on the Net, because it undermines and sidelines the Net’s heterogeneous and distributed system for supporting everybody and everything, and biases the whole thing to favor a few vertically-integrated ‘content’ industries.”

But there’s a flip side to all that sharing and fluid content, says Mike Roberts, former ICANN leader:

“God knows what will happen to the poor authors. John Perry Barlow says ‘information wants to be free,’ which pursued to the ultimate, pauperizes the authors and diminishes society thereby. There has been recent active discussion of this question on the ICANN former director list.”

Despite these looming systemic threats, many respondents were concerned about a challenge we already face every day — how to efficiently locate the information that we want, and how to guarantee that it will continue to be served to us. Writes Michael Starks, an information science professional:

“The challenge will be in separating the wheat from the chaff. Will people who can create, edit, judge, find and curate content for others become valued for those skills? If so — and if that value is reflected in the salaries those people receive — then highly networked populations will have greater access to better content as well as more content.”

According to the report’s authors, complaints about the sorting process of the future included but were not limited to: “algorithms often categorize people the wrong way and do not suit their needs; they do not change as people change; search algorithms are being written mostly by corporations with financial interests that could sway the ways in which they are being written; search algorithms can be gamed by certain outside interests to sway searches to their advantage.” Susan Etlinger of the Altimeter Group raised additional concerns:

“With regard to content, the biggest technical challenge will continue to be filter failure; algorithms today just cannot keep up with the number and type of signals that provisionally predict what a person will want at a certain point in time. There are so many barriers: multiple devices, offline attribution and of course simple human changeability. We will continue to see a push and pull with regard to privacy. People will continue to adapt, but their expectations for control and relevance will also increase.”

A few survey respondents went so far as to imagine the kinds of systems they believe might, or at least should, come into place to help us deal with the deluge of data. Marc Rotenberg, president of the Electronic Privacy Information Center, expressed concerns over consolidated control of the tools we use to find information:

“Currently, approximately 70% of Internet users in the U.S. and 90% in Europe obtain information by going through the search services of one company. This needs to change. There should be many information sources, more distributed, and with less concentration of control. So, I am hoping for positive change. We need many more small and mid-size firms that are stable and enduring. The current model is to find an innovation with monetizing potential, incorporate, demonstrate proof of concept, sell to an Internet giant, and then walk away. This will not end well.”

What might one type of small firm helping redistribute control of search and distribution look like?

Jonathan Grudin, principal researcher at Microsoft Research, predicted, “To help people realize their fullest potential, an industry of ‘personal information trainers’—by analogy to personal trainers for fitness—will form to help people find and access information that is interesting and useful for them. Reference librarians played this role when we went to the information repository called a library. As the volume of information continues to grow exponentially, personal information trainers will help us with the much more daunting task of creating a virtual dashboard to access the information of value to us, much of which we did not know was out there.”

Ultimately, writes Robert Cannon, U.S. Internet law expert, we are past “the initial utopian introduction that greets the technology with claims of world peace,” and past the era of competition. “In the information era,” he writes, “we have moved into the era of consolidation.” But there may yet be hope:

“Unlike other cycles where the era of consolidation also raised barriers to entry, in the modern information era, the barriers to entry still remain low. But this can change as conduit becomes entangled with content or service. […] This is the core concern of the Net neutrality debate: Will the Internet of the future look like the radio market or the telegraph market after consolidation, with few players controlling content — or will it continue to look like the never-ending marketplace of ideas?”

LINK: mobilemediamemo.com  ➚   |   Posted by: Joshua Benton   |   November 18, 2014

You may know Cory Bergman as the cofounder (and now general manager) of the innovative mobile app Breaking News, or as the cofounder of Seattle hyperlocal network Next Door Media. But now he’s got a new email newsletter, Mobile Media Memo, that I suspect a number of Lab readers will be interested in. (Subscribe here.) The first issue just went out and features some smart thoughts on a pet peeve of mine: journalists’ obsession with equating length and quality.

In the world of media, longer content is heralded as higher quality. A six-minute piece is more prestigious than a minute-twenty package. Full-length features trump shorts. Shows beat webisodes. Two-thousand words are better than two hundred. There are lots of reasons for the industry bias toward longer content. Legacy platforms and business models. Prominence and awards. Creative freedom and journalistic context. Ask just about anyone in the content business, and they prefer longer work.

[…]

That doesn’t mean there’s not a market for longer-form content on mobile. I read books and watch movies on my iPhone while flying back and forth from NYC. Tablet users, especially in evening and nighttime hours, read longer-form stories and binge on Netflix. But on average across the mobile universe, shorter content is consumed more. It’s also the gateway to longer forms of content: social apps act as recommendation engines for your attention. That’s how Facebook’s app became the “home page” of mobile, accounting for more time spent than all mobile browsers combined.

[…]

Part of the problem is the industry’s fixation on “time spent” as an engagement metric. I remember a Poynter study a couple years ago that discovered the average “bail out” point on a tablet is 78.3 seconds of reading. The recommendation? Write the story in such a way that gets users to keep reading. The obvious solution: write a shorter story.

It’s often better to maximize “time saved” rather than time spent, especially on a per session basis. Imagine, for example, that you can get the nugget of a 2-minute video in a 24-second clip, or 80% of the value in 20% of the time. For most mobile users, that’s more delightful than watching the full 2 minutes. The more delighted the users, the more frequently they’ll return, which all adds up to a lot of time spent/user at the end of the month.
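To put rough numbers on that last point (the figures below are purely hypothetical, just to illustrate the arithmetic): shorter sessions can still win on total monthly time spent per user, as long as the extra delight meaningfully increases how often people come back.

# Purely hypothetical numbers illustrating the "time saved" argument:
# shorter sessions can beat longer ones on monthly time spent per user
# if delighted users return more often.

def monthly_time_spent(seconds_per_session, sessions_per_month):
    return seconds_per_session * sessions_per_month

# Long-form approach: the full 2-minute video, but users return less often.
long_form = monthly_time_spent(seconds_per_session=120, sessions_per_month=8)

# Short-form approach: a 24-second clip, but delighted users come back far more often.
short_form = monthly_time_spent(seconds_per_session=24, sessions_per_month=50)

print(long_form, short_form)  # 960 vs. 1200 seconds per user per month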

LINK: ww2.cfo.com  ➚   |   Posted by: Joshua Benton   |   November 17, 2014

CFO magazine has an interview with Victoria Harker, the chief financial officer of Gannett, which is one of a number of news companies in various stages of splitting off their print properties (newspapers, mostly) from their broadcast and digital ones. The positive spin is that it’ll let each type of company pursue the best approach without a strategy tax; the negative spin is that it’s sending print off onto an ice floe where its continued decline will no longer infect the other side of the business. This exchange would seem to position Gannett as a candidate for the newspaper industry rollup (or mop-up) many have been anticipating (emphasis mine):

Q: Some people praise Gannett because it isn’t burdening the newspaper spin-off with debt, as other media companies have done. Others criticize Gannett for not including, say, Cars.com in the spin-off to provide more advertising revenue. How do you respond to these views?

A: Relative to the debt, we felt very strongly that the publishing segment — which has its own digital properties, by the way — needed to have the kind of capital structure that will enable them to be a consolidator in the industry, should that be the strategic decision they make. They have produced a very efficient model for running the newsroom of today and tomorrow. So we didn’t want to saddle them with a lot of debt. We wanted to enable a good revenue stream, a good cost structure, and good cash production, so they can do the kinds of things they need to do to create longevity within that business.

Relative to Cars.com, we will have affiliation agreements with the publishing business for five years after the deal closes. In our way of thinking it’s the best of both worlds, in that Cars.com will live in the broadcast and digital company, where it will have the right type of capital structure and investment, while the publishing side will continue to be able to leverage that relationship.

You know, we spent a lot of time with investors during the last 10 days, and a number of them asked how they can become an investor on both sides of the house once we spin. So it’s not that everybody wants to go into growth and be in broadcast and digital. We have a number of investors saying, “We’re very interested in publishing, this is an interesting story for the value side of our investment house.” And it’s a dividend-producing entity, which is very attractive to them.

Getting external capital for that sort of move will likely only get tougher, so flexibility on the balance sheet is important.

LINK: blog.pastpages.org  ➚   |   Posted by: Joshua Benton   |   November 13, 2014

Hopefully you know about PastPages, the tool built by L.A. Times data journalist Ben Welsh to record what some of the web’s most important news sites have on their homepage — hour by hour, every single day. Want to see what The Guardian’s homepage looked like Tuesday night? Here you go. Want to see how that Ebola patient first appeared on DallasNews.com in September? Try the small item here. It’s a valuable service, particularly for future researchers who will want to study how stories moved through new media. (For print media, we have physical archives; for digital news, work even a few years old has an alarming tendency to disappear.)

Anyway, Ben is back with a new tool called StoryTracker, “a set of open source tools for archiving and analyzing news homepages,” backed in part by the Reynolds Journalism Institute at Mizzou.

It offers a menu of options, documented here, for creating an orderly archive of HTML snapshots, extracting hyperlinks with a bonus set of metadata that captures each link’s prominence on the page, and visualizing a page’s layout with animations that show changes over time.

The potential uses for researchers are obvious, but I could also imagine plenty of realtime uses. Tracking your own homepage over time, you could get good data on how the granular movement of stories there correlates with traffic over time. (To ask questions like: Is the top slot more or less valuable on weekends or overnight than during the day Monday to Friday?) You could track your competition’s homepages to get hard data on what stories they’re pushing hardest. And unlike the base PastPages, which saves screenshots of homepages, StoryTracker gets at the HTML to determine what stories are where. It’s all open source, so have at it. (Here’s a sample analysis to see what sources the Drudge Report links to most.)
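StoryTracker handles all of this for you, but here is a rough sketch of what archiving a homepage and scoring link prominence involves. This is not StoryTracker’s own API; it’s a minimal hand-rolled Python illustration using requests and BeautifulSoup, with a placeholder URL and document order standing in as a crude proxy for on-page prominence.

# Not StoryTracker itself: a minimal hand-rolled sketch of the same idea.
# Snapshot a homepage's HTML, extract its links, and use document order
# as a crude proxy for how prominently each link is placed on the page.
import datetime

import requests
from bs4 import BeautifulSoup

def snapshot_homepage(url):
    """Fetch a homepage and return (timestamp, raw HTML) for archiving."""
    html = requests.get(url, timeout=30).text
    return datetime.datetime.utcnow(), html

def extract_links(html):
    """Return (rank, href, anchor text) tuples; lower rank means higher on the page."""
    soup = BeautifulSoup(html, "html.parser")
    return [
        (rank, a["href"], a.get_text(strip=True))
        for rank, a in enumerate(soup.find_all("a", href=True))
    ]

if __name__ == "__main__":
    # example.com is just a placeholder; point this at a real news homepage.
    ts, html = snapshot_homepage("http://www.example.com")
    for rank, href, text in extract_links(html)[:10]:
        print(ts.isoformat(), rank, href, text)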

Ben presented StoryTracker at a conference at RJI earlier this week; here’s the video and his slide deck.

LINK: www.nber.org  ➚   |   Posted by: Joshua Benton   |   November 10, 2014

Interesting new study (PDF) from Stefano DellaVigna of UC Berkeley and Johannes Hermle of the University of Bonn. From the abstract (emphasis mine):

Media outlets are increasingly owned by conglomerates, inducing a conflict of interest: a media outlet can bias its coverage to benefit companies in the same group. We test for bias by examining movie reviews by media outlets owned by News Corp. — such as the Wall Street Journal — and by Time Warner — such as Time.

We use a matching procedure based on reported preferences to disentangle bias due to conflict of interest from correlated tastes. We find no evidence of bias in the reviews for 20th Century Fox movies in the News Corp. outlets, nor for the reviews of Warner Bros. movies in the Time Warner outlets. We can reject even small effects, such as biasing the review by one extra star (out of four) every 13 movies. We test for differential bias when the return to bias is plausibly higher, examine bias by media outlet and by journalist, as well as editorial bias. We also consider bias by omission: whether the media at conflict of interest are more likely to review highly-rated movies by affiliated studios.

In none of these dimensions do we find systematic evidence of bias. Lastly, we document that conflict of interest within a movie aggregator does not lead to bias either.

(For an interesting and somewhat contradictory perspective, you might enjoy this great piece from the summer on the history of Entertainment Weekly and its role within the various iterations of Time Warner.)

So why don’t movie reviews get skewed to support the corporate parent? DellaVigna and Hermle suggest it’s the high degree of competition: “We conclude that media reputation in this competitive industry acts as a powerful disciplining force.” In other words, there are plenty of voices available on any given movie, so readers who think the fix is in for Horrible Bosses 2 would find it easy to switch to some other source of reviews.

(I’d argue another factor is that inaccurate movie reviews impose a more concrete cost on readers — a wasted movie ticket and a lame night out — than most other news products do. You generally don’t lose money and time if a city council story has a fact wrong. That direct tie to consumer behavior probably incentivizes more ready switching.)

If competition on a given subject discourages bias, you can imagine the opposite would be true too — less competition, more bias. You can certainly read that as discouraging: After all, there are many beats that have fewer professional reporters covering them than 10 or 20 years ago. But you could also read it as encouraging, since social media and personal publishing can bring corrective voices to the fore. In all cases, it seems to be a critical mass of interested voices that can help tamp down (or at least surface) bias.

LINK:   ➚   |   Posted by: Joseph Lichterman   |   November 6, 2014

News video aggregator Watchup just announced a new funding round, a $2.75 million investment led by Tribune Media, the broadcast arm of the former Tribune Company. With this round, Watchup has now raised $4.25 million since its launch in 2012. McClatchy is also an investor, along with prior funders the Knight Enterprise Fund, the Stanford-StartX Fund, and businessmen Ned Lamont, Gordon Crovitz, and Jim Friedlich.

“We are so excited about this round because we have brought together a select group of media innovators who are willing to contribute their industry knowledge and their content to help us reinvent the video news experience,” Adriano Farano, Watchup’s co-founder and CEO, said in a statement.

Watchup (which started as a Knight News Challenge winner) is an app that allows users to build personalized newscasts by pulling video from dozens of global and local news outlets. Most of the video is pulled in through public YouTube channels, but Watchup also has agreements with The Washington Post, The Wall Street Journal, PBS NewsHour, and other organizations to directly provide video to the app.

This round marks the latest in a series of investments legacy news organizations are making in news startups. In September, Vice Media received $500 million in funding, including $250 million from A&E Networks, which is owned by Hearst and Disney. Last month, The New York Times Co. and German publishing behemoth Axel Springer said they were investing $3.7 million into Blendle, a Dutch news reading platform where readers pay by the article.

Tribune Media, for its part, is investing in Watchup because it “extends our vision of expanding the reach of quality local news content,” Larry Wert, Tribune’s president of broadcasting, said in a statement. Tribune Media owns or operates 42 different local broadcast stations.
