As the newspaper industry has grown weaker and weaker, there has been a steady stream of articles and blog posts blaming Google for some or all of this decline. I’m not going to link to them all, because there are simply too many, and they are easy enough to find. The standard allegation is that the search engine, and other similar engines such as Yahoo and MSN, hijack readers by aggregating content, and then monetize those eyeballs by posting ads near the content. Newspapers get traffic, but Google critics argue that this traffic is essentially worthless — or at least can’t make up for the value that Google has siphoned off.
One of the most recent articles to take this tack appeared in the Guardian and quoted Sly Bailey, the chief executive officer of newspaper publisher Trinity Mirror. Among other things, Ms. Bailey said:
“By creating gargantuan national newspaper websites designed to harness users by the tens of millions, by performing well on search engines like Google, we have eroded the value of news. News has become ubiquitous. Completely commoditised. Without value to anyone.”
This argument is almost too absurd to be taken seriously. In a nutshell, Ms. Bailey is claiming that by expanding their readership and making it easier for people to find their content, newspapers have shot themselves in the foot, and should do their best to avoid being found by new readers. It’s particularly ironic that the Mirror CEO is making these comments in a story in The Guardian, which has built up an impressive readership outside the UK thanks to its excellent content.
Blogger and search scientist Daniel Tunkelang wrote a positive response to Ms. Bailey’s comments, saying he agreed with her that newspaper content was being devalued by Google — a theme that also emerged recently in an interview with the publisher of the Wall Street Journal, in which he said that “Google devalues everything it touches.” And then Daniel throws out a proposal: What if newspapers simply used the robots.txt file (a plain-text file that tells search-engine crawlers which parts of a site they may and may not index) to block Google? If they all did it, wouldn’t that make it easier for them to monetize their content, instead of losing that value to the search engine?
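For what it’s worth, the mechanism Daniel is describing is trivially simple. A robots.txt file along these lines — a sketch based on the standard robots exclusion directives, where Googlebot is Google’s crawler — would block Google from a site entirely while leaving other crawlers alone:

```
# Block Google's crawler from the whole site
User-agent: Googlebot
Disallow: /

# All other crawlers may index everything
User-agent: *
Disallow:
```

The barrier to Daniel’s proposal isn’t technical, in other words — it’s that every publisher would have to agree to do it at once, and any one of them could defect and scoop up the traffic the others gave away.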
Reading an earlier post in which Daniel and Mike Masnick debated the “Google devalues everything it touches” statement, it’s fairly obvious that Tunkelang has thought long and hard about search and how it works, and how to make more relevant content available to users. But I still think that he and publishers like Sly Bailey are completely wrong in how they think about what Google is doing. If there were a finite market for news and information, then the search engine could be accused of devaluing it — but that’s not how information works. In fact, oceans of interchangeable news make certain kinds of content even more valuable, not less.
The reality (as I have said over and over) is that if a newspaper or media outlet finds its business model severely impacted by the fact that Google excerpts a single paragraph of a news story, then it deserves to fail. If the value that you are providing for readers consists of a snappy headline and a 200-word arrangement of facts that can be easily duplicated, then you are in the wrong business. And if you are adding more value through context and analysis, then there are many more ways to monetize that than by slapping simple banner or text ads on it — which seems to be the only thing that Daniel and others can imagine newspapers doing.
But if you are actually adding value, wouldn’t you like as many people to find out about it as possible? Cutting yourself off from the world’s largest search engine is like cutting off your nose to spite your face. Why not just stop publishing on the Internet at all, and just go back to delivering content in one format to people, whether they want it or not? It’s absurd. The bottom line is that newspapers need to think about what kind of value they are adding and focus on that, instead of trying to either beat Google at its own game or pretend that it doesn’t exist.