Nieman Foundation at Harvard
July 6, 2010, 10 a.m.

The ASCAP example: How news organizations could liberate content, skip negotiations, and still get paid

Jason Fry suggested in a post here last week that current paywall thinking might be just a temporary stop along the way to adoption of “paytags — bits of code that accompany individual articles or features, and that allow them to be paid for.” But how? As Fry recognizes, “between wallet friction and the penny gap, the mechanics of paytags make paywalls and single-site meters look like comparatively simple problems to solve.”

I suggested a possible framework for a solution during a couple of sessions at the conference “From Blueprint to Building: Making the Market for Digital Information,” which took place at the University of Missouri’s Reynolds Journalism Institute June 23-25. Basically, my “what-if” consisted of two questions:

  1. What if news content owners and creators adopted a variation on the long-established ASCAP-BMI performance rights organization system as a model by which they could collect payment for some of their content when it is distributed outside the boundaries of their own publications and websites?
  2. And, taking it a step further, what if they used a variant of Google’s simple, clever, and incredibly successful text advertising auction system to establish sales-optimizing pricing for such content?

News publishers have been tying themselves in knots for the last few years deciding whether or not to charge readers for content, and if so, how much and in what fashion — micropayments, subscriptions, metered, freemium and other ideas have all been proposed and are being tested or developed for testing.

As well, publishers have complained about the perceived misuse of their content by aggregators of all stripes and sizes, from Google News down to neighborhood bloggers. They’ve expressed frustration (“We’re mad as hell and we are not going to take it anymore,” Associated Press chair Dean Singleton said last year), and vowed to go after the bandits.

But at the same time, many publishers recognize that it’s to their advantage to have their content distributed beyond the bounds of their own sites, especially if they can get paid for it. When radio was developed in the 1920s, musicians and music publishers recognized they would benefit from wider distribution of their music through the new medium, but they needed a way to collect royalties without each artist having to negotiate individually with each broadcaster.

A model from music

That problem was solved by using a non-profit clearinghouse, ASCAP (the American Society of Composers, Authors and Publishers), which had been formed in 1914 to protect rights and collect royalties on live performances. Today the performance-rights market in the U.S. is shared between ASCAP, BMI (Broadcast Music, Inc., founded by broadcasters rather than artists) and the much smaller SESAC (formerly the Society of European Stage Authors & Composers). Using digital fingerprinting techniques, these organizations collect royalties on behalf of artists whose works are performed in public venues such as restaurants and shopping centers as well as on radio and television stations and streaming services such as Pandora.

Publishers have put a lot of effort into trying to confine news content to tightly-controlled channels such as their own destination websites, designated syndication channels, apps, and APIs in order to control monetization via advertising and direct user payments. But when content moves outside those bounds, as it can very easily, publishers have no way to regulate it or collect fees — so they cry foul and look for ways to stop the piracy or extract payments from the miscreants.

Among the content-protection schemes, AP is rolling out News Registry, which it touts as a way of at least tracking the distribution of content across the web, whether authorized or not, and Attributor offers “anti-piracy” services by “enforcement experts” to track down unauthorized use of content. But for now, content misuse identified by these systems will require individual action to remove it or force payment. In the long run, that’s not a viable way to collect royalties.

Suppose, instead, that news publishers allowed their content to be distributed anywhere online (just as music can be played by any radio station) as long as it were licensed by a clearinghouse, similar to ASCAP and BMI, that would track usage, set prices, and channel payments back to the content creator/owner.

To do this, perhaps the paytags Fry suggested are needed, or perhaps publishers can learn from the music industry and use the equivalent of the digital fingerprints that allow ASCAP’s MediaGuide to track radio play. (The basic technology for this already exists: AP’s News Registry uses hNews microtags as well as embedded pixels, or “clear GIFs”; Attributor’s methodology is closer to the digital fingerprinting technique.)

How it could work

The system for broadcast and performance music payments is a three-way exchange consisting of (a) artists and composers, (b) broadcasters and performance venues, and (c) performance rights organizations (ASCAP and BMI).

In the news ecosystem the equivalents would be (a) content creators and owners, (b) end users including both individual consumers and “remixers” (aggregators, other publishers, bloggers, etc.); and (c) one or more content clearinghouses providing services analogous to those of ASCAP and BMI.

The difference between a news payments clearinghouse and the music industry model would be in scale, speed and complexity. In the news ecosystem, just as in the music world, there are potentially many thousands of content creators — but there are millions of potential end users, compared to a manageable number of radio stations and public performance venues paying music licensing fees. And there are far more news stories than musical units; they’re distributed faster and are much shorter-lived than songs. In the radio and public performance sphere, music content still travels hierarchically; that was true in the news business 20 years ago, but today news travels in a networked fashion.

To handle the exchange of rights and content in this vastly more complex environment, a real-time variable pricing model could be developed, benefiting both the buyers and sellers of content. Sellers benefit because with variable pricing, or price discrimination, sales and revenue are maximized: content goods are sold across the price spectrum to various buyers at the price each is willing to pay — think of the way airline seats are sold. Buyers benefit because they can establish the maximum price they are willing to pay. They may not be able to buy at that price, but they are not subject to the take-it-or-leave-it of fixed pricing.
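A back-of-the-envelope sketch makes the revenue argument concrete. The buyers and prices below are invented for illustration; the point is simply that selling at each buyer’s own maximum can beat any single fixed price:

```python
# Hypothetical illustration: five buyers, each with a maximum price
# (willingness to pay) for the same piece of content.
willingness_to_pay = [0.50, 0.25, 0.10, 0.05, 0.02]

def revenue_at(price):
    """Revenue from a single fixed price: only buyers whose maximum
    meets or exceeds that price will buy."""
    return price * sum(1 for w in willingness_to_pay if w >= price)

# The best any single fixed price can do:
best_fixed = max(revenue_at(p) for p in willingness_to_pay)

# Variable pricing at its theoretical limit: each buyer pays their maximum.
discriminated = sum(willingness_to_pay)

print(best_fixed)               # 0.5 -- one buyer at $0.50, or two at $0.25
print(round(discriminated, 2))  # 0.92 -- all five buyers, each at their own price
```

In this toy example, perfect price discrimination nearly doubles revenue while also serving four buyers a fixed price would have turned away — the same logic behind airline seat pricing.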

When it comes to news content, a variable pricing strategy was suggested last year by Albert Sun, then a University of Pennsylvania student and now a graphics designer with The Wall Street Journal. (Sun also wrote a senior thesis on the idea called “A Mixed Bundling Pricing Model for News Websites.”) The graphs on his post do a good job of showing how a price-discrimination strategy can maximize revenue; it was also the subject of one of my posts here at the Lab.

A well-known real-time variable pricing arrangement is Google’s AdWords auction system, which establishes a price for every search ad sold by Google. Most of these ads are shown to users at no cost to the advertisers; they pay only when the user clicks on the ad. The price is determined individually for each click, via an algorithm that takes into account the maximum price the advertiser is willing to pay; the prices other advertisers on the same search page are willing to pay; and the relative “Quality Score” (a combination of clickthrough rate, relevancy, and landing page quality) that Google assigns to each advertiser. It works extraordinarily well, not only for advertisers but for Google, which reaps more than $20 billion in annual revenue from it.
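The mechanics can be sketched in a few lines. This is a minimal quality-weighted second-price auction, loosely modeled on public descriptions of Google’s ad auction; the bidder names, scores, and rounding rule are illustrative assumptions, not Google’s actual algorithm:

```python
# Sketch of a quality-weighted second-price ("generalized second price")
# auction. All numbers are invented for illustration.

def rank_and_price(bidders):
    """bidders: list of (name, max_bid, quality_score) tuples.
    Returns a list of (name, price_per_click) in ranked order."""
    # Rank by bid weighted by quality score ("ad rank").
    ranked = sorted(bidders, key=lambda b: b[1] * b[2], reverse=True)
    results = []
    for i, (name, bid, quality) in enumerate(ranked):
        if i + 1 < len(ranked):
            # Pay just enough to beat the next bidder's ad rank.
            _, next_bid, next_quality = ranked[i + 1]
            price = round(next_bid * next_quality / quality + 0.01, 2)
        else:
            price = 0.01  # nominal floor for the last-ranked bidder
        results.append((name, min(price, bid)))
    return results

bidders = [("A", 2.00, 0.9), ("B", 3.00, 0.4), ("C", 1.00, 0.8)]
print(rank_and_price(bidders))
# [('A', 1.34), ('B', 2.01), ('C', 0.01)]
```

Note how quality weighting shifts prices: top-ranked bidder A pays less per click than lower-ranked B, because B’s poor quality score forces it to pay more to hold its position — each participant pays a price set by the competition rather than a fixed rate card.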

Smart economist needed

What’s needed in the news ecosystem is something similar, though quite a bit more complex. Like the Google auction, the buyer’s side would be simple: buyers (whether individuals or remixers such as aggregators) establish a maximum price they are willing to pay for a particular content product — this could be an individual story, video, or audio report, or it could be a content package, like a subscription to a topical channel. This maximum price is determined by an array of factors that will be different for every buyer, but may include timeliness, authoritativeness, relevance to the buyer’s interests, etc., and may also be affected by social recommendations or the buyer’s news consumption habits. But for the purposes of the algorithm, all of these factors are distilled in the buyer’s mind into a maximum price point.

The seller is the content creator or owner who has agreed to share content through the system, including having remixers publish and resell it. Sellers retain ownership rights, and share revenue with the remixer when a transaction takes place. The price that may be acceptable to a content owner/seller will vary (a) by the owner’s reputation or authority (this is analogous to Google’s assignment of a reputation score to advertisers), and (b) by time — since generally, the value of news content will drop quickly within hours or days of its original publication.

The pricing algorithm, then, needs to take into account both the buyer’s maximum price point and the seller’s minimum acceptable price based on time and reputation; and at least two more things: (a) the uniqueness of the content — is it one of several content items on the same topic (multiple reports on an event from different sources), or is it a unique report not available elsewhere (a scoop, or an enterprise story) — and (b) the demand for the particular piece of content — is it popular, is it trending up, or has it run its course?
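Since the actual algorithm remains to be designed, the following is purely a hypothetical sketch of how those inputs — the buyer’s maximum, a time- and reputation-adjusted seller floor, uniqueness, and demand — might combine. Every function name, weight, and half-life below is an invented assumption:

```python
# Hypothetical pricing sketch; all formulas and constants are assumptions.

def seller_floor(base_price, hours_since_publication, reputation,
                 half_life_hours=24.0):
    """Minimum acceptable price: decays with time since publication,
    scaled by the seller's reputation score (0 to 1)."""
    decay = 0.5 ** (hours_since_publication / half_life_hours)
    return base_price * reputation * decay

def clearing_price(buyer_max, floor, uniqueness, demand):
    """uniqueness and demand in [0, 1]: a scooped, trending story clears
    near the buyer's maximum; commodity content clears near the floor."""
    if buyer_max < floor:
        return 0.0  # no sale -- or free distribution, as noted above
    weight = (uniqueness + demand) / 2
    return floor + (buyer_max - floor) * weight

# A day-old story from a reputable source, sold to one hypothetical buyer:
floor = seller_floor(base_price=0.25, hours_since_publication=24,
                     reputation=0.8)  # 0.25 * 0.8 * 0.5 = 0.10
print(round(clearing_price(buyer_max=0.30, floor=floor,
                           uniqueness=0.9, demand=0.7), 2))  # 0.26
```

Different buyers feeding in different maximums would land at different points between the floor and their own limit, which is exactly the spread of prices along the demand curve described below.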

The outcome of this auction algorithm would be that different prices would be paid by different buyers of the same content — in other words, sales would occur at many points along the demand curve as illustrated in Sun’s post, maximizing revenue. But it’s also likely that the system would establish a price of zero in many cases, which is an outcome that participating publishers would have to accept. And of course, many remixers would choose to offer content free and step into the auction themselves as buyers of publication rights rather than as resellers.

In my mind, the actual pricing algorithm is still a black box, to be invented by a clever economist. For the moment, it’s enough to say that it would be an efficient, real-time, variable pricing mechanism, maintained by a clearinghouse analogous to ASCAP and BMI, allowing content to reach end users through a network, rather than only through the content creator’s own website and licensees. Like ASCAP and BMI, it bypasses the endless complexities of having every content creator negotiate rights and pricing with every remixer. The end result would be a system in which content flows freely to end users, the value of content is maximized, and revenue flows efficiently to content owners, with a share to remixers.

Clearly, such a system would need a lot of transparency, with all the parties (readers, publishers, remixers) able to see what’s going on. For example, if multiple news sources have stories on the same event, they might be offered to a reader at a range of prices, including options priced above the reader’s maximum acceptable price.

Protecting existing streams

Just as ASCAP and BMI play no role when musicians sell content in uncomplicated market settings the musicians can control — for example, concert tickets, CD sales, posters, or other direct sales — this system would not affect pricing within the confines of the content owner’s own site or its direct licensees. But by enabling networked distribution and sales well beyond those confines, it has the potential to vastly increase the content owner’s revenue. And, the system need not start out with complex, full-blown real-time variable pricing machinery — it could begin with simpler pricing options (as Google did) and move gradually toward something more sophisticated.

Now, all of this depends, of course, on whether the various tentative and isolated experiments in content pricing bear fruit. I’m personally still a skeptic on whether they’ll work well outside of the most dominant and authoritative news sources. I think The New York Times will be successful, just as The Wall Street Journal and Financial Times have been. But I doubt whether paywalls at small regional newspapers motivated by a desire to “protect print” will even marginally slow down the inevitable transition of readers from print to digital consumption of news.

A better long-term strategy than “protect print” would be to move to a digital ecosystem in which any publisher’s content, traveling through a network of aggregators and remixers, can reach any reader, viewer or listener anywhere, with prices set efficiently and on the fly, and with the ensuing revenue shared back to the content owner. The system I’ve outlined would do that. By opening up new potential markets for content, it would encourage publishers to develop higher-value content, and more of it. The news audience would increase, along with ad revenue, because content would travel to where the readers, listeners or viewers are. Aggregators and other remixers would be incentivized to join the clearinghouse network. Today, few aggregators would agree to compensate content owners for the use of snippets. But many of them would welcome an opportunity to legitimately use complete stories, graphics and videos, in exchange for royalties shared with the content creators and owners.

Granted, this system would not plug every leak. If you email the full text of a story to a friend, technically that might violate a copyright — just like sharing a music file does — but the clearinghouse would not have the means to collect a fee (although the paytag, if attached, might at least track that usage). There will be plenty of sketchy sites out there bypassing the system, just as there are sketchy bars that have entertainment but avoid buying an ASCAP license.

But a system based on a broadly-agreed pricing convention is more likely to gain acceptance than one based on piracy detection and rights enforcement. Like ASCAP’s, the system would require a neutral, probably nonprofit, clearinghouse.

How could such an entity be established, and how would it gain traction among publishers, remixers and consumers? Well, here’s how ASCAP got started: It was founded in 1914 by Victor Herbert, the composer, who was well-connected in the world of musicians, composers, music publishers and performance venues, and who had previously pushed for the adoption of the 1909 Copyright Act. Herbert enlisted influential friends like Irving Berlin and John Philip Sousa.

Today, just as a few outspoken voices like Rupert Murdoch are moving the industry toward paywalls, perhaps a few equally influential voices can champion this next step, a pricing method and payments clearinghouse to enable publishers to reap the value of content liberated to travel where the audience is.

Acknowledgments/disclosures: The organizer of the conference where I had the brainstorm leading to this idea, Bill Densmore, has spent many years thinking about the challenges and opportunities related to networked distribution, payment systems, and user management for authoritative news content. A company he founded, Clickshare, holds patents on related technology, and for the last two years he has worked at the University of Missouri on the Information Valet Project, a plan to create a shared-user network that would “allow online users to easily share, sell and buy content through multiple websites with one ID, password, account and bill.” Densmore is also one of my partners in a company called CircLabs, which grew out of the Information Valet Project. The ideas presented in this post incorporate some of Densmore’s ideas, but also differ in important ways including the nature of the pricing mechanism and whether there’s a need for a single ID.

Photo by Ian Hayhurst used under a Creative Commons license.
