
Fair Syndication Consortium: News orgs’ new way to confront Google?

Remember? Two months ago, Associated Press chairman Dean Singleton said his organization would take a firm stand against unlicensed use of its content and that of its members. “We are mad as hell,” he declared at the AP’s annual meeting in San Diego, “and we’re not going to take it any more.”

Singleton is a newspaper man. His first reporting gig came as a teenager in Graham, Texas, and now he’s in charge of MediaNews Group, the nation’s fourth-largest newspaper company. So, of course, he knew that channeling Howard Beale was certain to find its way into every article and blog post about the speech. That’s why he said it, and that’s how most people learned of the AP’s supposed crackdown on piracy of its work. (Watch the meeting here, or listen to the magic words below.)

[Audio: Dean Singleton's remarks at the AP annual meeting]

Here’s what followed: Google was said to be a major target of the speech, even though Singleton didn’t mention the company or even the phrase “search engine.” News aggregators were also assumed to be in the crosshairs, although The Huffington Post, like Google, is a paying customer of the AP. Everyone was very angry, and nuance seemed to be lost amid all the saber-rattling. Since then, the AP has done little to clarify whom, exactly, it’s mad at or how it plans to address that anger.

Shift the tale to New York, three weeks later, at the headquarters of Thomson Reuters, where a slew of major news organizations — but not the AP — gathered to consider a new tack in combating online piracy. Reuters and Politico were already on board. So was every member of the Magazine Publishers Association.

They proposed banding together as the Fair Syndication Consortium with an innovative approach to combating the true tapeworms of the online news business: not Google, certainly, or Arianna Huffington, but wholesale copiers of content. The consortium is targeted, in part, at spam blogs — or splogs — that reprint news articles and posts in their entirety alongside cheap advertising. Splogs are typically automated, and the only human being involved is the one who gets a check at the end of the month.

What the consortium seeks to do is turn tapeworms into fungus. It doesn’t want to shut down splogs and their ilk, which would be a largely Sisyphean task of enormous cost. Instead, the consortium is negotiating with the networks that serve ads against pirated content for a substantial share of that revenue.

Abusive sites, under this arrangement, could operate with legal cover and might proliferate as a result, so publishers would have to get used to the idea of their content appearing across the web, on servers they don’t control, amid page designs that only exist to sell cheap advertising. But it’s really just a cruder — or, you might say, more organic — form of traditional syndication. I hesitate to overhype this, but the concept, if not this particular application of it, has the potential to fundamentally shift how publishers conceive of distributing their content on the web.

There’s a company behind the consortium: Attributor, which crawls the Internet in search of copied content for a host of media companies (including, incidentally, the AP). At a meeting of newspaper executives in Chicago last week, Attributor CEO Jim Pitkow estimated that publishers are losing a total of $250 million annually to splogs and other sites that copy their content. (In a phone interview yesterday, he told me that estimate was based on a “conservative” CPM of roughly 25 cents, though for reasons we could discuss in the comments, I’m not convinced of their math.)
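
For a rough sense of how much that estimate depends on the CPM assumption, here is a back-of-envelope sketch in Python. Every input below is illustrative; the pageview figure is simply reverse-engineered from the $250 million claim, not taken from Attributor’s data.

    # Back-of-envelope: annual ad revenue implied by pirated pageviews.
    # All inputs are illustrative assumptions, not Attributor's figures.
    def annual_revenue(monthly_pageviews, cpm_dollars):
        """CPM is dollars per 1,000 pageviews."""
        return monthly_pageviews / 1000 * cpm_dollars * 12

    # At a $0.25 CPM, hitting $250 million a year implies roughly
    # 83 billion pirated pageviews every month:
    monthly_views = 250_000_000 / (0.25 / 1000) / 12
    print(f"{monthly_views:,.0f} pageviews/month at a $0.25 CPM")

    # At the penny-level CPMs more typical of spam blogs, the same
    # volume of copying implies a far smaller pool of recoverable revenue:
    print(f"${annual_revenue(monthly_views, 0.02):,.0f}/year at a $0.02 CPM")

If splog CPMs are closer to the pennies that commenters below suggest than to a quarter, the same volume of copying supports an estimate more than an order of magnitude smaller.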

This is where Google, et al. enter the equation — in the role of partner rather than adversary. Attributor says that 94% of ads on splogs and other sites that pirate content are served by DoubleClick, Yahoo, or Google AdSense. Since Google owns DoubleClick, the consortium’s negotiations can focus on two companies with a strong interest in appearing supportive of intellectual property. But I should really stop explaining this because we obtained the slides that Pitkow presented in Chicago, and they make the consortium’s case pretty clear:

[Slides from Jim Pitkow’s presentation at the Chicago meeting]
Nearly everyone I’ve spoken to with knowledge of the Chicago meeting, where newspaper companies were pitched on a variety of online business plans, says that Pitkow’s presentation of the Fair Syndication Consortium was by far the most popular. Attributor, which will be taking an undisclosed cut of the revenue, won’t announce who has signed onto the consortium for another few weeks, but expect it to include lots of major newspaper companies and blog networks. One reason the consortium has been so well-received is that publishers are taking on little risk.

“The worst-case scenario is that some publishers don’t make much money from this,” Pitkow told me. The best case? That not only do Google and Yahoo cooperate, opening up a new revenue stream for content producers, but publishers begin actively seeking wide distribution of their work with no compensation except a share of advertising.

The consortium’s negotiations with ad networks are ongoing, though Pitkow, of course, said they’ve been “encouraging.” I expect they’ll succeed, but a sticking point could be the share of revenue, which might vary from publisher to publisher. Pitkow said it could range from 25 to 75 percent, but it may not be his call.

The Associated Press, meanwhile, is not joining the consortium. (Pitkow wouldn’t comment.) Instead, it seems to be striking out on its own with a system that, at least in its outlines, sounds awfully similar to what I’ve described here. The only problem is that, in the meantime, many of the AP’s members appear to have defected to the Fair Syndication Consortium.

And if that’s the case, then Singleton is surely mad as hell.

                                   
  • http://www.fairsyndication.org Rich Pearson

    Zach,

    Great series and smart analysis throughout. Anticipating questions about the “math”:

The eCPM assumptions are pretty tough to nail, particularly in this environment, and it is admittedly not Attributor’s expertise.

    We do feel confident in the findings that the audience viewing publisher content on unauthorized sites (mostly “legitimate” sites but also including spam blogs) is 5x the audience on publishers’ own destination sites.

Another reason why we feel the opportunity is conservative is that the analysis was isolated to U.S. IPs only.

Open to other questions, and if anyone wants to see how widely their content is being reused, they can go to fairshare.cc and sign up for free.

  • D

    –Attributor CEO Jim Pitkow estimated that publishers are losing a total of $250 million annually to splogs and other sites that copy their content. (In a phone interview yesterday, he told me that estimate was based on a “conservative” CPM of roughly 25 cents…–

As I’m sure you’re aware, this actually seems like a wildly aggressive CPM estimate for splog-like sites. I would be surprised if their CPMs were higher than a couple of pennies.

    Looking at what Attributor does, I’m not surprised by his math. The CEO has every incentive to exaggerate the cost of the problem.

  • Siva Vaidhyanathan

Why did you use the word “piracy”? We are talking about legal online aggregation and linking. Why would you use such a judgmental word?

For now, what aggregators do is legal and what newspaper publishers are proposing is illegal (collaborating on setting prices).

    Please be careful.

  • Zachary M. Seward

    D, you’ve hit on one of my two questions about the $250 million figure, which is that AdSense CPMs on those sites are typically pennies, not quarters.

    My other concern is that they actually just estimated the lost ad revenue for the 25 most popular U.S. publishers and extrapolated from there. Of course they weighted for traffic and everything, but it’s my impression that splogs and other forms of full-content piracy are orders of magnitude more common among some of the big blogs like TechCrunch than any regional newspaper — that is, they are pirated at a rate disproportionate to their traffic. Even piracy of The New York Times seems far less common than piracy of, say, Gawker Media sites, likely because the Times doesn’t offer any full-content RSS feeds.

    That’s just my anecdotal impression, but I did ask Pitkow for his lost-ad-revenue estimate of just the top 25 publishers. He said he wasn’t cleared to release that information, which is fair enough, but it would help clarify this issue. (Am I making sense here? I’m writing quickly on a cramped bus that smells faintly of garbage, and I can only barely see the screen of my laptop, so who knows!)

    Now, Pitkow was kind enough to go over the $250 million figure with me several times, and he did acknowledge that it’s a very rough estimate. If it turned out that piracy is only really common among the top publishers, that wouldn’t defeat Attributor’s premise. It would just mean that the consortium would only be of real benefit to sites like TechCrunch, Gawker Media, and The Huffington Post. No problem there — unless, of course, you’re one of the newspaper companies they were pitching in Chicago.

    Thanks, Rich, for diving in first with those clarifications. (For everyone else: Rich is VP of marketing at Attributor.)

    And, Siva, I — and, more to the point, the consortium — are not talking about “legal online aggregation and linking.” The whole point of this post is to make that distinction. —Zach

  • Marcus

Wow... this piece sure is loaded and slanted... where is the comment from the other side... the supposed aggregators? I certainly hope that none of these newspapers aggregate. Oh wait, they do all the time!

    Just look at WSJ, DOW, or just about any other paper. Even the Associated Press aggregates.

And let’s not forget pesky things like fair use and illegal combine law (antitrust to US types).

  • http://www.fairsyndication.org Rich Pearson

    Zach – no problem and fair points all around. The proof will obviously be in the revenue that is delivered. Until then, it is all speculation :-)

    And just to reiterate your point as I don’t think Marcus understood – the consortium is focused on helping publishers capture their fair share of *full copy* reuse, removing the fair use question.

  • http://www.wordyard.com Scott Rosenberg

I’ve never understood this “five times more people see the content elsewhere than at its site of original publication” figure. It makes no sense, particularly if this conversation is about splogs. Splogs scrape a little traffic here and there from the bottom of the Google barrel, but there’s no way that, you know, five times more people read the NYT’s front-page story on splogs than read it at NYTimes.com.

    If the 5-times figure is really talking about “people who see a headline and summary on Huffington Post” then it still sounds inflated but it might make a little more sense. But if that’s what we’re talking about then this doesn’t represent traffic (and dollars) that Attributor — which Rich Pearson says is focusing solely on full-content reuse — can win back for publishers.

  • Zachary M. Seward

    I don’t feel like I have any way to judge the claim, Scott, but I share your skepticism. (I’m more enthusiastic about the concept, the spirit of the consortium, and the potential for proactive syndication.)

    Rich Pearson first mentioned the 5x stat to me in an interview on April 23, when, at least according to my notes, he didn’t bring up the $250 million or any other dollar figure. I think they prepared that for the Chicago meeting as a more dramatic way of illustrating the problem — even if it’s neither dramatic nor a problem. To Attributor’s credit, they’ve been pretty clear about how rough their estimates are: In the slides above, the first two of three outcomes in their “scenario analysis” are failures.

    This post was written on June 5. What I found most puzzling about today’s Times story was its conflation of Attributor and the Fair Syndication Consortium. The former is running the latter, but even if the consortium is a flop, Attributor will still profit from the fees they already charge to track content usage across the web. So they’ve got a pretty sound business plan in all this, even if newspapers don’t. —Zach
