Nieman Foundation at Harvard
May 6, 2011, 2 p.m.

Krishna Bharat on the evolution of Google News and the many virtues of “trusting in the algorithm”

In a blog entry just posted to the Google News blog, Google News’ founder, Krishna Bharat, offers a fascinating comparison between Google’s “coverage” of 9/11 and the news it provided of Osama bin Laden’s death earlier this week. “We have certainly come a long way in the last decade,” he notes. “Indeed, Google News now has over 70 editions in over 30 languages, and sends over 1 billion clicks a month to news publishers worldwide.” (For the micro-mechanical view of this evolution, check out Danny Sullivan’s wonderfully detailed take over at Search Engine Land.)

Just as search overall has evolved to serve an online ecosystem that, in turn, has evolved from it, Google News has both changed, and responded to changes within, the online news environment. In terms of the system’s user interface, that’s meant the integration of video, images, and social media, and of experiments like Living Stories and Editors’ Picks. “The world has become much more complex now,” Bharat says; Google News now has “a lot more sources,” with more social signals coming in. I spoke with Bharat this afternoon to learn a little more about the insights that have informed Google News’ evolution since he founded it in 2002.

Speed, and responding to it

The world of online news, first of all, is much more complex than it was when Google News was nascent. “News moves much faster,” Bharat told me, and “everybody’s in a race to get it out there.” And that’s in large part because we move much faster. With the new tools available to us, we’re in constant conversation — and, in some cases, competition — with each other. That’s a good thing for the cause of information distribution — consumers get more news, faster — but it also presents a challenge for the people, including the people at Google News, who want to serve us that information. The biggest challenge, Bharat notes, for a system premised on the authority of sharing: “Fresh news lacks hyperlinks.” As he puts it in his post, “Google’s ranking depended on links from other authors on the web. Fresh news, by definition, was too fresh to accumulate such links.”

Google News’ original algorithm accounted for that problem — and its evolution since then has continued to account for it through the feature’s inclusion of, among other things, social media updates. And that’s increased not only Google News’ speed as a source of real-time information, but also its reliability as a source of that information. A key insight of Google News, Bharat notes, is that “many perspectives coming together” can be much more educational than singular points of view. “It’s a highly connected world, and news fundamentally allows us to understand this complex world — and navigate it.”

The human algorithm

I asked Bharat about Google News’ experience with human aggregation as well as algorithmic. “Over time,” he replied, “I realized that there is value in the fact that individual editors have a point of view. If everybody tried to be super-objective, then you’d get a watered-down, bland discussion.” Instead, “you actually benefit from the fact that each publication has its own style, it has its own point of view, and it can articulate a point of view very strongly.” Provided that perspective can be balanced with another — one that, basically, speaks for another audience — that kind of algorithmic objectivity allows for a more nuanced take on news stories than you’d get from individual editors trying, individually, to strike a balance. “You really want the most articulate and passionate people arguing both sides of the equation,” Bharat says. Then, technology can step in to smooth out the edges and locate consensus. “From the synthesis of diametrically opposing points of view,” in other words, “you can get a better experience than requiring each of them to provide a completely balanced viewpoint.”

“That is the opportunity that having an objective, algorithmic intermediary provides you,” Bharat says. “If you trust the algorithm to do a fair job and really share these viewpoints, then you can allow these viewpoints to be quite biased if they want to be.”

Do what you do best…

Under that logic, furthermore, Bharat notes, those viewpoints — which is to say, individual news articles — don’t necessarily need to be complete. While, in the past, articles had to stand on their own — the lede, the narrative, the context, all wrapped together in a one-stop destination for readers — the digital ecosystem allows individual news stories more narrative freedom to be short or long, narrow or broad. Trusting in the algorithm means trusting in the tacit completeness of the automation it offers to readers. It means, for news organizations, that informing readers need not mean “owning” stories’ coverage. It may not even mean completely covering those stories. It means, rather, adding your own valuable contribution, and assuming that others will add theirs. [Insert your preferred Jeff Jarvis quote here.]

“We see ourselves as the Yellow Pages,” Bharat says of Google News. It’s a third party, a connector within an ecosystem. Bharat sees three primary layers in that system. Publishers produce articles; those articles get picked up by people and search engines and aggregators; they get discovered by other people — who, in turn, tweet about them and blog about them and otherwise extend their lifespan and impact. And all three of those broad components, he notes, “are hugely interdependent.”

That’s not only a conciliatory note for news organizations that have alternately feared and resented Google News’ role as an aggregator; it’s also a recognition of the mutuality of content and distribution in a digital world. “This is an ecosystem that has many players,” Bharat says. And “they need each other.”
