LINK: www.pewinternet.org  ➚   |   Posted by: Caroline O'Donovan   |   July 3, 2014

Pew Research Center’s Internet & American Life Project is out with a report today based on a survey of leaders in information technology. Earlier, Pew asked experts what digital life would look like in 2025; today’s update focuses on potential threats to our information network.

The concerns are broken down into four categories:

1) Actions by nation-states to maintain security and political control will lead to more blocking, filtering, segmentation, and balkanization of the Internet.

2) Trust will evaporate in the wake of revelations about government and corporate surveillance and likely greater surveillance in the future.

3) Commercial pressures affecting everything from Internet architecture to the flow of information will endanger the open structure of online life.

4) Efforts to fix the TMI (too much information) problem might over-compensate and actually thwart content sharing.


The first two categories have broad implications for journalists — for their safety, for example, and for how they practice their craft. Like everyone else, journalists will have to push back against creeping government and commercial control while learning how to resist surveillance.

The second two categories, however, speak more directly and immediately to the world of digital publishing. For example, as those commercial pressures mount, it will become increasingly important to make sure that news and information providers understand and have access to the heavy-duty tools of the Internet, so that rapidly consolidating power and money do not overwhelm them:

Glenn Edens, director of research in networking, security, and distributed systems at PARC, said, “Network operators’ desire to monetize their assets to the detriment of progress represents the biggest potential problem. Enabling content creators to easily and directly reach audiences, better search tools, better promotion mechanisms and curation tools — continuing to dismantle the ‘middle men’ is key.”

The fluidity of content was also a major concern for some respondents, who look forward to a future in which access is considered a right and who believe that sharing is the antidote to a fractured Internet:

Clark Sept, co-founder and principal of Business Place Strategies Inc., wrote, “Online content access and sharing will be even better and easier by way of personal digital rights access. Sharing freely will be recognized as having greater long-term economic value than strictly limited controls over ‘intellectual property.’”

Or, more briefly:

Jim Harper, director of information policy studies at the Cato Institute, responded, “People are going to get what they want, and they want to share content.”

If the risk of an information ecosystem in which content is produced and controlled by a few economically powerful players isn’t clear, Doc Searls explains:

“What the carriers actually want — badly — is to move television to the Net, and to define the Net in TV terms: as a place you go to buy content, as you do today with cable. For this they’ll run two-sided markets: on the supply side doing deals with “content providers” such as Hollywood and big publishers, and on the demand side by intermediating the sale of that content. This by far is the most serious threat to sharing information on the Net, because it undermines and sidelines the Net’s heterogeneous and distributed system for supporting everybody and everything, and biases the whole thing to favor a few vertically-integrated ‘content’ industries.”

But there’s a flip side to all that sharing and fluid content, says Mike Roberts, former ICANN leader:

“God knows what will happen to the poor authors. John Perry Barlow says ‘information wants to be free,’ which pursued to the ultimate, pauperizes the authors and diminishes society thereby. There has been recent active discussion of this question on the ICANN former director list.”

Despite these looming systemic threats, many respondents were concerned about a challenge we already face every day — how to efficiently locate the information we want, and how to guarantee that it will continue to be served to us. Writes Michael Starks, an information science professional:

“The challenge will be in separating the wheat from the chaff. Will people who can create, edit, judge, find and curate content for others become valued for those skills? If so — and if that value is reflected in the salaries those people receive — then highly networked populations will have greater access to better content as well as more content.”

According to the report’s authors, complaints about the sorting process of the future included but were not limited to: “algorithms often categorize people the wrong way and do not suit their needs; they do not change as people change; search algorithms are being written mostly by corporations with financial interests that could sway the ways in which they are being written; search algorithms can be gamed by certain outside interests to sway searches to their advantage.” Susan Etlinger of the Altimeter Group raised additional concerns:

“With regard to content, the biggest technical challenge will continue to be filter failure; algorithms today just cannot keep up with the number and type of signals that provisionally predict what a person will want at a certain point in time. There are so many barriers: multiple devices, offline attribution and of course simple human changeability. We will continue to see a push and pull with regard to privacy. People will continue to adapt, but their expectations for control and relevance will also increase.”

A few survey respondents went so far as to imagine the kinds of systems they believe might, or at least should, come into place to help us deal with the deluge of data. Marc Rotenberg, president of the Electronic Privacy Information Center, expressed concerns over consolidated control of the tools we use to find information:

“Currently, approximately 70% of Internet users in the U.S. and 90% in Europe obtain information by going through the search services of one company. This needs to change. There should be many information sources, more distributed, and with less concentration of control. So, I am hoping for positive change. We need many more small and mid-size firms that are stable and enduring. The current model is to find an innovation with monetizing potential, incorporate, demonstrate proof of concept, sell to an Internet giant, and then walk away. This will not end well.”

What might one type of small firm helping to redistribute control of search and distribution look like?

Jonathan Grudin, principal researcher at Microsoft Research, predicted, “To help people realize their fullest potential, an industry of ‘personal information trainers’—by analogy to personal trainers for fitness—will form to help people find and access information that is interesting and useful for them. Reference librarians played this role when we went to the information repository called a library. As the volume of information continues to grow exponentially, personal information trainers will help us with the much more daunting task of creating a virtual dashboard to access the information of value to us, much of which we did not know was out there.”

Ultimately, writes Robert Cannon, a U.S. Internet law expert, we are past “the initial utopian introduction that greets the technology with claims of world peace,” and past the era of competition. “In the information era,” he writes, “we have moved into the era of consolidation.” But there may yet be hope:

“Unlike other cycles where the era of consolidation also raised barriers to entry, in the modern information era, the barriers to entry still remain low. But this can change as conduit becomes entangled with content or service. […] This is the core concern of the Net neutrality debate: Will the Internet of the future look like the radio market or the telegraph market after consolidation, with few players controlling content — or will it continue to look like the never-ending marketplace of ideas?”


Last month, BuzzFeed’s executive editor for news Shani Hilton stopped by the Nieman Foundation, where the Nieman Fellows and I had the chance to ask her a few questions.

Hilton was promoted to her current role this September, a position that makes her responsible for, among many other things, developing a set of newsroom standards for BuzzFeed’s ever-growing staff. In addition to talking about that, we touched on hiring strategy, diversity, how you know when a new project isn’t working, international expansion, and more.

Our sister publication, Nieman Reports, has the highlights reel in text; for true devotees, the full audio of the interview is below.


Earlier this year Spanish lawmakers passed a law requiring Google and other aggregators to pay local publishers for snippets or links to stories. As Europe continues to turn up the heat on Google, the company decided today to shut down Google News in Spain.

While it’s still uncertain how much companies like Google would have to pay every time an article appears, the penalty for not paying the fee is almost $750,000. That was apparently more than enough reason for Google to take its ball and go home. Richard Gingras, head of Google News, writes:

This new legislation requires every Spanish publication to charge services like Google News for showing even the smallest snippet from their publications, whether they want to or not. As Google News itself makes no money (we do not show any advertising on the site) this new approach is simply not sustainable. So it’s with real sadness that on 16 December (before the new law comes into effect in January) we’ll remove Spanish publishers from Google News, and close Google News in Spain.

According to Mark Scott of The New York Times, Google plans to remove all Spanish publishers from its “global news aggregating products.” What effect Google’s decision will have on traffic for the Spanish news sites remains to be seen. As SEO consultant Adam Sherk’s recent analysis showed, the amount of traffic a site gets from Google News can vary.

All across Europe, publishers have been demanding that Google start paying for content. Media companies in France, Spain, and Germany have led the fight, accusing the search company of getting rich off publishers’ copyrighted work. A similar law was passed in Germany, but rather than paying the fees outlined in the law, Google gave publishers the choice to opt in to appear in search results. By opting in, companies waived their right to get paid. As Catherine Stupp wrote for the Lab earlier this month, there were immediate results:

To avoid paying the collection agency, VG Media, which represents the publishers that chose not to opt in, Google stopped showing snippets from their news articles on Oct. 23. Shortly after that, the publishers in VG Media gave Google a license to restore snippets to their search results — for free. Berlin-based Axel Springer, one of Europe’s largest publishers, announced on Nov. 5 that it had caved to Google’s pressure after traffic to its websites from Google dropped by 40 percent and from Google News by 80 percent when snippets were left out of search results.

LINK: www.adamsherk.com  ➚   |   Posted by: Joshua Benton   |   December 9, 2014

There’s Google and then there’s Google News. One tries to soak up the entire Internet; the other indexes only a curated selection of news sites. It’s easy to confuse the two, since you’ll often get “Google News” results at the top of a standard Google search page even if you never go near the URL news.google.com. But they’re distinct parts of Googleland.

Google and publishers have a fraught relationship, and plenty have given thought to what it would be like to pull out of one or both Google corpora. (Axel Springer found out.) But how important is each to your overall search traffic? Is it your site’s presence in Google that’s driving it, or its presence in Google News?

That’s the question asked in this interesting piece by SEO consultant Adam Sherk. He used a tool to try to determine how much of 80 news sites’ search traffic came from general search and how much came from Google News. The answer: It depends.

On the high end, you had Reuters and the Christian Science Monitor, which each get more than 40 percent of their search traffic from Google News — either from the Google News site itself or a search onebox. (Update: Sherk now says oneboxes aren’t included here, which means the real impact of Google News is understated.)

[Chart: sites with the highest share of search traffic from Google News]

At the very bottom? BuzzFeed, with less than 1 percent coming from Google News.

[Chart: sites with the lowest share of search traffic from Google News]

It’s hard to generalize too much from the data. The Christian Science Monitor, despite its somewhat old-fashioned reputation, is actually something of an SEO powerhouse, quite good at staying on top of Google Trends and posting webby copy that matches what people are searching for in the moment. It makes sense that Reuters, as a wire service, would do well for in-the-moment news searches. And that BuzzFeed’s search traffic comes overwhelmingly from the non-news side of Google makes sense, given its abundance of evergreen listicles.

But you also have sites like Mashable (4%) and Business Insider (5%) in the low-from-Google-News category, and Bloomberg Businessweek (29%) and Newsweek (19%) on the high-from-Google-News end — each of which is the opposite of what I would have expected. So it’s more complicated than it might seem. But the broad majority of sites seem to be in that 5 to 25 percent range — meaning Google News makes up a significant but not overwhelming part of most sites’ search traffic.

Check out Sherk’s post to see data on 80 major news sites — both raw totals and the News/non-News split.


Recent media news headlines have briefly sucked the digital discourse around new and legacy media back into the reductive binary of pro- and anti-Internet.

While asking whether the Internet helps or hurts journalism is about as useful as asking if technology is good or bad, the Pew Research Internet Project does have a study out today that comes down pretty clearly on one side.

The survey of 1,066 internet users shows that 87% of online adults say the internet and cell phones have improved their ability to learn new things, including 53% who say it has improved this “a lot.” Internet users under age 50, those in higher income households, and those with higher educational attainment are especially likely to say the internet and cell phones help them “a lot” when it comes to learning new things.

Asked if they enjoy having so much information at their fingertips or if they feel overloaded, 72% of internet users report they like having so much information, while just 26% say they feel overloaded.

[…]

News: Substantial majorities also feel better informed about national news (75%), international news (74%), and pop culture (72%) because of these tools.

Not only do individual Americans feel more personally informed because of the Internet; a majority also believe that society at large is better informed. Interestingly, survey respondents generally felt that the Internet improved their knowledge of distant topics — pop stars and international news — more than it increased their understanding of things like local news or civic issues. Sixty percent of those surveyed said they felt better informed about local news because of the Internet, while 74 percent and 75 percent said these tools made them better informed about international and national news, respectively.

Media news tends to focus on the national narrative — BuzzFeed versus The New York Times versus whoever’s spending millions of dollars to build a huge new website this week. But despite the efforts of programs like the Knight Foundation’s Community Information Challenge, the tougher nut for the Internet to crack seems to be disseminating information on a more granular level.


A new report out today from the Pew Research Center’s Journalism Project takes a look at how partnerships work in journalism by way of five case studies. Rick Edmonds and Amy Mitchell write about collaborations between Charlottesville Tomorrow and The Daily Progress; I-News Network, Rocky Mountain PBS, and KUSA-TV; five Texas newspapers; The Lens and WWNO Public Radio; and The Toronto Star and El Nuevo Herald. It’s worth noting that these examples include both nonprofit and commercial partnerships.

The report finds that, broadly, the majority of these partnerships are born out of economic necessity, and that, despite their increasing prevalence, they can be difficult to manage successfully. Interestingly, the authors say that many of these collaborations are easier to execute in legacy media — namely print and broadcast — than digitally, because of technological barriers such as incompatible content management systems.

Also of interest is the observation that few of the partnerships are financial in nature. For the most part, the goal is to work more efficiently, reach a broader audience, and tell a better story, rather than for one side or the other to increase revenue. For example, the Texas Front-Page Exchange has been sharing content gratis for five years now. From the report:

What stood in the way of this sort of cooperation for decades was industry prosperity, big newsroom budgets and a tradition whose definition of quality began with running only the work of your own staff along with wire stories.

But particularly after papers scaled back any statewide circulation ambitions as hard times set in, there came to be very little competition for audience among the five.

Other editors share Mong’s view that the cooperation, while not central to editorial strategy, is a distinct plus. Nancy Barnes came to the Chronicle in October 2013 after years at the Star Tribune of Minneapolis and began as a skeptic. “I was surprised—giving away all that content for free? But in fact these are all very distinct markets. The exchange helps us avoid redundant effort. It seems a very innovative solution.”
