Nieman Journalism Lab
Pushing to the future of journalism — A project of the Nieman Foundation at Harvard

GateHouse-NYT Co. deal: A bad precedent for the web

It’s going to take some time to think through the implications of the settlement (PDF link) announced today between the New York Times Co. and GateHouse Media, over the issue of NYT’s Boston.com site aggregating content from local sites belonging to GateHouse, but my first instinct is that it is almost unrelentingly bad. Why? Because while the settlement is not a legally-binding precedent — the one piece of what might be called good news — it still involves the New York Times voluntarily refraining from what many would argue is perfectly defensible behaviour. As Joshua Benton notes in his post here, that could well embolden other publications to launch similar cases, on the assumption that if the NYT caved then someone else might too.

The Times tries to argue that this settlement does nothing to change the way it approaches linking to or even quoting from external sources on its websites, but that clearly isn’t the case at all. It completely changes the way the paper does that, but only when the content involves a GateHouse website. The NYT claims that it will continue to link to and quote from external sources whenever it wants, but will no longer do so with GateHouse content (under the agreement it can continue to link, but can no longer aggregate content in an automated way, and has agreed not to quote from a GateHouse site).

This kind of dual status for linking and quoting is going to be virtually impossible to defend, I would argue. What possible rationale could the NYT create for taking one approach to GateHouse content and another to content from everywhere else? The only obvious reason is that one sued the company and the others haven’t. That’s an invitation to further court cases.

My biggest fear (and I don’t think I’m alone) is that every settlement like this one weakens the defences around the entire structure of the Web, in which linking and quoting — in some limited, representative way — is a fundamental principle. Not only that, but doing so is a right that is enshrined in the U.S. copyright principle of “fair use.” It’s true that there are all sorts of limits placed by the courts on that principle (although the simple fact that a site is run by a commercial entity is not a de facto exclusion from fair-use protection), but I would argue that it is still a vitally important principle, and one we shouldn’t be too quick to give up.

I recognize that the NYT has corporate responsibilities to consider, and that it probably didn’t want to engage in a protracted legal battle over this issue — particularly during tough economic times — but I think the agreement it has entered into is a major step backward for media and the Web.
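The opt-out mechanism at the center of disputes like this one is the robots.txt protocol: a publisher that objects to automated aggregation can tell a specific crawler to stay away while remaining open to everyone else. Here is a minimal sketch — the crawler name and URLs are invented for illustration, and the approach only works when a crawler publishes its user-agent string and honors the protocol:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt a publisher might serve; the crawler
# name "ExampleAggregator" is an assumption, not a real bot.
ROBOTS_TXT = """\
User-agent: ExampleAggregator
Disallow: /

User-agent: *
Allow: /
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# The named aggregator is blocked from everything...
print(rp.can_fetch("ExampleAggregator", "http://example.com/news/story.html"))  # False
# ...while other well-behaved crawlers remain welcome.
print(rp.can_fetch("Googlebot", "http://example.com/news/story.html"))  # True
```

Python's standard-library parser implements the same matching rules a well-behaved crawler applies, so the snippet shows both sides of the exchange: the policy the publisher serves and the check the crawler is expected to make before fetching.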

  • http://www.our-hometown.com Stephen Larson

    Hey, what’s wrong with stopping the wholesale copying of headlines and first paragraphs of news stories? Most sites that do this do it in an automated fashion and provide no value. Everything on their pages is the work of someone else.

    Look at http://www.topix.com: they produce nothing, and yet they profit from the AdSense ads.

    Links to news stories are fair use but not copying the headline and first paragraph. Write your own take on a story and link your text to the story if you want.

  • http://www.mathewingram.com/ Mathew Ingram

    That’s what makes it such a contentious issue, Stephen. Plenty of sites would argue — Topix included, I’m sure — that the simple aggregation of links adds value.

    I think there is some merit to that argument, so long as the site doesn’t quote more than a small amount and links back to the originating site, which in turn produces traffic that the site can monetize.

    Your argument would make Google News and pretty much any other news aggregator illegal or impossible. Is that a good thing?

  • Pingback: Why you care about GateHouse vs the New York Times : Wicked Local Blog

  • Pingback: GateHouse-NYT deal: A bad precedent

  • http://www.our-hometown.com Stephen Larson

    @Mathew Ingram – would making “Google News and pretty much any other news aggregator illegal” without them getting explicit permission from the copyright holders be a good thing?

    Possibly for news suppliers, but it should be a choice, and the default should be that permission is not granted.

    Smaller aggregators like http://www.topix.com would have a hard time getting that permission for free without the default being that permission is granted (as most aggregators profess that robots.txt and similar protocols grant). I say good riddance to http://www.topix.com; they are likely sucking the life out of AdSense CPC rates for news content.

    Google News probably would get permission without paying from most but maybe not.

    I use Google News myself for financial news. I tend to click on stories from the NY Times. If their stories were not on Google News, I likely would go to the business section of the NY Times website more frequently. It could be a reasonable business decision to not be on Google News if they won’t pay.

  • http://www.mathewingram.com/work Mathew Ingram

    But why should the default be that permission isn’t granted? The biggest issue with your proposal is that it wouldn’t just target Google News or Topix, but would affect anyone who links for any reason whatsoever — and there is no plausible way to make it apply to one site and not everyone, which is kind of my point about the GateHouse/NYT agreement.

    The reality is that Google News and likely Topix as well pay for that headline and lead paragraph (assuming they aren’t already covered by fair use) with traffic, which I expect both drive in fairly large numbers. Boston.com drove respectable amounts of traffic to GateHouse’s site too, but for some reason the company would rather forgo that benefit.

    I’ll say it again: the agreement is a bad precedent for newspapers, and it’s bad for the Internet and online media as a whole. If the headline and lede paragraph are the entire value you are offering, you have more problems than a lawsuit can fix.

  • http://timwindsor.com Tim Windsor

    Isn’t the presence or lack of ads on the supposedly offending site a red herring? After all, it’s either a fair use or not. Fair use isn’t limited to not-for-profit publications.

    But I do agree with Mathew that we’re on a slippery slope here, with a potentially huge downside for the link economy.

  • http://www.our-hometown.com Stephen Larson

    @Mathew Ingram – If the headline and lede paragraph have no value, then you should have no objection to not using them.

  • http://www.mathewingram.com/work Mathew Ingram

    That’s a good point, Tim. Many blogs have AdSense as well — should they be forbidden from linking or quoting without permission too?

    Stephen, I’m not saying the headline and lede don’t have value — I’m just saying if that’s such a huge proportion of the value that you need to sue over it, then you have a problem.

  • http://www.our-hometown.com Stephen Larson

    @Mathew Ingram – “Google News and likely Topix as well pay for that headline and lead paragraph .. with traffic”. And who decided that traffic is enough? It is about choice: let the publisher choose.

  • http://www.mathewingram.com/work Mathew Ingram

    The publisher can choose what aggregators like Google News do — that’s what robots.txt is for. What they shouldn’t be able to do is tell everyone, blogger and mainstream aggregator alike, that they can’t use a headline or lede paragraph without permission.

  • http://www.our-hometown.com Stephen Larson

    @Mathew Ingram – “you need to sue over it”, and that is the problem for smaller players. At least this settlement makes it a little easier for publishers to contend that taking headlines and ledes without permission is not fair use. One small step for publishers.

  • http://www.our-hometown.com Stephen Larson

    @Mathew Ingram – try stopping http://www.topix.com with robots.txt; they don’t (or at least didn’t when we tried) publish the instruction (the User-Agent name) to do so. The default should be that permission is not granted; then only true fair use would permit copying.

  • http://www.mathewingram.com/work Mathew Ingram

    You’re making my point for me — it *is* fair use, or arguably should be. That’s why a settlement like this creates a bad precedent — because it encourages other publishers to think the same way, which is to argue that the NYT has effectively said it isn’t fair use and shouldn’t be allowed.

  • http://www.our-hometown.com Stephen Larson

    @Mathew Ingram – I guess that is it, you and I disagree. Use of headlines and ledes is not fair use and this settlement supports that.

  • http://techdirt.com/ Mike Masnick

    While I agree almost 100% with Mathew, and almost 100% disagree with Stephen, it is worth pointing out that both Topix and Google *have* in fact worked out agreements to pay (with money) the Associated Press. Moreover, another competitor chose not to pay and was sued by the AP.

    Of course, the whole thing is ridiculous. If these sites don’t want the traffic, there are technological means to avoid it.

  • http://timwindsor.com Tim Windsor

    And just one final note. The source publisher doesn’t get to determine what is fair use. Fair use is determined by law and precedent.

    There’s no granting of permission or opting in or out involved. Unless copyright law is rewritten.

  • http://www.mathewingram.com/work Mathew Ingram

    Thanks, Mike — that’s a big reason why I think this kind of settlement is bad. You put all of these kinds of deals together and pretty soon people figure that the issue has been settled and fair use is no longer even a question — and that’s bad not just for Google or Topix or the NYT but for everyone and for the Web in general.

  • http://www.mathewingram.com/work Mathew Ingram

    That’s it exactly, Tim. Some publishers and content owners want to rewrite copyright law so that fair use effectively doesn’t exist, and everyone has to ask permission before they can use even the smallest amount for any purpose.

    That’s bad law, and it would make for a significantly less useful Internet as well — and likely wouldn’t even help the content owners who seem to think it’s their only hope.

  • http://www.our-hometown.com Stephen Larson

    No one is suggesting a rewrite of copyright law; people simply disagree as to what fair use is. That is why it would have been better if this case had gone all the way. However, you can’t blame GateHouse for settling; they got what they wanted.

  • http://spap-oop.blogspot.com Tish Grier

    The way I understand it, this agreement is between these two parties so as to avoid setting a precedent for everyone else.

    Think of it this way, though: what if this were a blogger who simply didn’t want to be aggregated by a particular site? I might have this objection if the site (news or not) that I was being aggregated on did not correspond with my personal or political beliefs. I could then say “hey! I don’t want your traffic because I don’t support your viewpoint!” and as a blogger it would be fine for me to ask that, and to expect the other guy to drop my feed. Hopefully, I wouldn’t have to bring it into the courts because it would be settled informally.

    At Placeblogger.com, when someone requests their feed be removed from our aggregator, we grant their request. No problem.

    But as I understand it, GateHouse just couldn’t say “hey, quit aggregating our feed!” and the matter had to go to court.

    So I don’t see it as setting a precedent, but as an agreement between two particular parties, and any other disputing parties will have to forge their own agreements. Maybe it will make a difference for aggregation sites, but that will depend on whether or not the parties involved agree to the aggregation.

  • http://www.mathewingram.com/ Mathew Ingram

    It may not set a legal precedent, Tish, but I think it sets a precedent in practice, and that it’s going to encourage others to file lawsuits on the same basis.

    As far as aggregation goes, I think it’s very enlightened of you to remove feeds if someone doesn’t want them to be aggregated — but I don’t think you should *have* to, provided you aren’t reprinting the full text.

    As far as I’m concerned, RSS feeds are a content-distribution method, and they imply that the content is meant to be used — in accordance with copyright of course — and that includes fair use, quoting, etc. If you don’t want your content used, then don’t put out an RSS feed.

  • http://www.niemanlab.org Zach Seward

    Will have some more on this in the a.m., but I think you’re right, Mathew. In addition to the points you make, the case may have precedent qualities because a) other media companies could argue that GateHouse’s arguments prevailed here; and b) it begins to establish “common practice” for linking and aggregation, which courts would consider in deciding future cases with similar circumstances.

  • http://www.mathewingram.com/ Mathew Ingram

    Thanks, Zach — that’s my fear exactly.

  • http://ideas.typepad.com/webu Bill Dunphy

    This is an issue that is going to take some time to shake out, I think. And for once, I’m not sure where the answer lies.
    One of the things that worries me, and that I think is glossed over with the “fair use” argument is this hard thought:
    What if people really are reading shorter and shorter snatches of stories, as more than a few studies have demonstrated? If all people want from most stories IS the nut graph – and sites are free to copy that and republish it themselves – then that use seems less and less fair.
    I get that one of our most important jobs these days is filtering, curating the infoflow, but I’m more than a little afraid of what a future ruled by this approach will look like.

  • http://blog.syracuse.com/newstracker Brian Cubbison

    It’s possible for a case like this to be stretched into a bad precedent, and that’s the fear here, but the idea of fair use implies the possibility of unfair use.

    The case was not about the links that are the backbone of the web. It wasn’t about the blogger who links to interesting sources, or the blogger who quotes a headline and a few paragraphs from the local newspaper then riffs on the story. It’s not even about the blogger who starts The Bedford Falls Watchdog and includes a feed from the local newspaper in a rail on the side of the blog. And it’s not really about search, although that was allowed to muddy the settlement.

    All those uses do indeed work on the principle of sending traffic through the link. But when aggregation comes to selling ads against content, there’s the possibility for exploitation. It would be possible to create a Neo York Times out of feeds and reblogged content from the real Times and sell ads of your own against that. More likely, it would be Potter Media Corp. creating a hyperlocal site off the work of local bloggers. It’s possible to aggregate just enough to make it not worth clicking through to the original content and ads. Unfair use suggests very little of your own work, riding on someone else’s work, for your profit and not theirs.

    That’s too close to link farms, spam blogs and hot-linked images to be considered in the generous spirit of the link economy. The ethos of the link economy should be supportive and not exploitative. It shouldn’t take a lawsuit, but there are times when it’s reasonable for someone to say: Back off, be fair, you’re using too much of my work, and come to an understanding.

    There’s something else at work here. It’s the separation of reporting from the bundling of news. Traditionally, it’s been a vertical organization: Reporters turn in their news, which goes down the assembly line through various bundlers (editors, layout artists, printers), all working for the same organization. We’re starting to see a horizontal process. Scattered reporters do their reporting, scattering it to the winds by feeds and streams, then unrelated bundlers like Daylife or Alltop design aggregators as they see fit. The trouble is, the reporters for now are still working for other aggregators, who bear the overhead.

    We’d like to think there’s value in the original reporting, but unless the reporter can watermark a fact and sell an ad that travels with that fact wherever it goes, the aggregator will be the middleman who makes the money. Why not, I guess, if the aggregator has created an elegant design that delivers the news you want. Should we all go out and sell ads against the Times’ reporting, so that it becomes a competition for the best interface design?

    The questions arise when the link economy transitions from sharing in the conversation (and a link is a generous thing to offer someone else) to selling ads against content (and the middleman is grabbing the lapels of your customers before they get to your link).

  • Oz

    Honestly, who wouldn’t want the NY Times linking to your site? I mean, what, does Gatehouse have something against short spikes in incoming traffic? Link to me, NYT. Take my headlines. Run everything I write and link back to me – please!

  • Pingback: The Gatehouse Settlement | PlagiarismToday

  • http://blog-me-no-blogs.blogspot.com/ cosanostradamus

    .
    I think you may be right, Mr. Ingram. This could become a Napster situation. Because copyright law requires you to defend your property, just as real property law does (lest you lose it), there will have to be a definitive decision on this at some point.

    It’s good that it didn’t happen now. I’d like to see the current generation of senior judges die off first. I don’t think most of them would understand the technology or the ethos of computers, the Internet and blogging in particular. Not sure that the next generation will be any less corporatist, but at least they’ll know how to turn a computer on.

    The bottom line is, the Internet belongs to the taxpayer. Big corporations can go on using it for free, but not at the expense of our freedoms. The advertising-supported model that has turned radio and TV into a vast wasteland cannot be allowed to do the same to the ‘Net. It’s a public place, not a private market. We have a right to read, review, discuss, debate, critique, quote and link to anything anybody puts in our public square. Let’s keep creeping privatization out of our public discourse.

    And while we’re at it, let’s find a better model that provides for direct support of content creators by consumers, without a middleman soaking up most of the gravy, who then uses the money for lawyers to try to limit the rights of creators and consumers.
    .

  • http://www.topix.com Chris Tolles

    I love it when people like some IT vendor like Stephen Larson make broad statements about the value of Topix.

    Since we’ve had over 10,000 publishers *ask* to be included in Topix (dwarfing the folks like GateHouse who asked to be removed), it is pretty clear that there are people who recognize that being included on CNN’s front page of local news, or Mapquest’s local news headlines, or myAOL or ESPN’s local sports stories, is useful (since we are powering all of these guys).

    Just because the IT vendor can’t see the value doesn’t mean it doesn’t exist :-)

    This decision just reasserts the status quo – people can link to you, and if you really get bent out of shape, you can block them from crawling you.

    Clearly no one *really* wants to go to court, since we don’t know what will happen.

    Obviously, the web is built off of “opt out” linking rules, and if Stephen Larson thinks he’s going to get “opt in” copyright rules on the web, he’s likely to go out of business before *that* happens, because any sane person is going to settle first…

    Mind you, the real issue is that most local areas don’t have enough coverage — we’ve grown the commentary to supply news in areas which don’t have that coverage, such that most new subjects on Topix are user-generated now. 80% of the pageviews on Topix are on the commentary, with 75% of the threads coming in without a referring article.

    So even if people like Stephen get their way, we’ll do just fine, thank you.

  • http://www.daylife.com Upendra Shardanand

    If you read the claim (http://www.citmedialaw.org/sites/citmedialaw.org/files/2008-12-22-Gatehouse%20Media%20Complaint.pdf) this seems like an edge case outside the realm of the normal polite opt-in/opt-out issues. This was a contentious case between two direct rivals. For example:

    1) Boston launched a direct competitor to WickedLocal, to the point of using the same positioning language
    2) Gatehouse asked Boston to stop indexing, several times. Boston refused.
    3) Gatehouse tried to block the IP of Boston’s crawler. Boston found a work-around and continued to crawl.

    .. etc. It’s worth reading the claim to get the flavor.

    [And to echo Chris' comments - at Daylife, we get daily publisher requests to *be* indexed, and requests to be more heavily promoted. Opt-outs are nearly non-existent. Market data indicates Gatehouse is the exception. Especially when Newscorp, Timewarner, Gannett, WPNI, and nearly every other major media company practices aggregation in big and small ways.]

  • Pingback: Der BetaBlog im temporären Exil auf Hundertfünfzig Worte

  • Pingback: GateHouse-NYT Co. Deal: A Bad Precedent for the Web » Nieman Journalism Lab » Pushing to the Future of Journalism « Predicate, LLC | Editorial + Content Strategy

  • Cog

    Topix.com is a nightmare to get content removed from.
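A closing note on the mechanics the thread keeps circling: what a typical link-economy aggregator extracts from a publisher's RSS feed is just the headline and the link, sometimes with a snippet. A minimal sketch in Python, using an invented feed and story for illustration:

```python
import xml.etree.ElementTree as ET

# A minimal RSS 2.0 fragment standing in for a publisher's feed;
# the outlet, story, and URLs here are invented for illustration.
FEED = """\
<rss version="2.0">
  <channel>
    <title>Example Gazette</title>
    <item>
      <title>Town council approves budget</title>
      <link>http://example.com/news/budget.html</link>
      <description>The council voted 5-2 on Tuesday to approve...</description>
    </item>
  </channel>
</rss>
"""

root = ET.fromstring(FEED)
for item in root.iter("item"):
    headline = item.findtext("title")
    link = item.findtext("link")
    # A link-economy aggregator stops here: headline plus a link
    # back to the source, leaving the full story (and its ads)
    # on the publisher's own site.
    print(f"{headline} -> {link}")
```

Everything beyond this point — republishing the full description, or enough of it that readers never click through — is where the fair-use argument in the comments above begins.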