Aug. 19, 2010, 4 p.m.

The web dies, the hype lives: What Wired left out of its eulogy

Maybe you heard: The web has been declared dead, and everybody’s mad about it.

I’ll get to checking the web’s vital signs in a moment, but one thing is clear: The hype and hucksterism of packaging, promoting, and presenting magazine articles is very much alive. I found Chris Anderson’s Wired article and Michael Wolff’s sidebar pretty nuanced and consistently interesting, which made for an awkward fit with the blaring headlines and full-bore PR push.

But looking past this annoyance, Anderson’s article makes a number of solid points — some I hadn’t thought of and some that are useful reminders of how much things have changed in the past few years. (For further reading, The Atlantic’s Alexis Madrigal has a terrific take on why the model of continuous technological revolution and replacement isn’t really correct and doesn’t serve us well, and Boing Boing nails why the graphic included in the Wired package is misleading.)

Still, Anderson almost lost me at hello. Yes, I like to use my iPad for email — and I frequently check out Facebook, Twitter, and The New York Times on it. But for the latter three, I don’t use apps but the browser itself (in my case, AtomicWeb). As I’ve written before, so far the iPad’s killer app is the browser — more specifically, the chance to have a speedy, readable web experience that doesn’t require you to peer at a tiny screen or sit down in front of a laptop or desktop. So going by Anderson’s own opening examples, the web isn’t dead for me — better to say that apps are in the NICU.

But I couldn’t argue with this: “Over the past few years, one of the most important shifts in the digital world has been the move from the wide-open web to semi-closed platforms that use the Internet for transport but not the browser for display.” That’s absolutely correct, as is Anderson’s observation that this many-platform state of affairs is “the world that consumers are increasingly choosing, not because they’re rejecting the idea of the web but because these dedicated platforms often just work better or fit better into their lives (the screen comes to them, they don’t have to go to the screen).”

That not-going-to-the-screen is critical, and — again — a big reason that the iPad has been a hit. But as my iPad habits show, that doesn’t necessarily imply a substitution of apps for the web. Nor, as Anderson himself notes, are such substitutions really a rejection of the web. It would have been less compelling but more accurate to say that the web isn’t dying but being joined by a lot of other contact points between the user and the sea of digital information, with points emerging for different settings, situations, and times of day. Sometimes a contact point is a different presentation of the web, and sometimes it’s something else entirely.

Do users care? Should they?

It’s also interesting to ask whether users of various devices care — and whether they should. Anderson brings up push technology and, with it, PointCast, a name that made me shudder reflexively. A long time ago, WSJ.com (like most every media company of the time) became infatuated with push, going as far as to appoint a full-time editor for it. It was tedious and horrible, a technology in search of an audience, and our entire newsroom was thrilled when the spell was broken and the damn thing went away. But Anderson notes that while PointCast didn’t work, push sure did. Push is now so ubiquitous that we only notice its absence: When I’m outside the U.S. and have to turn off push notifications to my phone, I have the same in-limbo feeling I used to get when I was away from my computer for a couple of days.

The problem with the first incarnation of push was that the only contact point was the computer screen, meaning information often wasn’t pushed close enough to you, or was being pushed down the same pipe you were trying to use for something else. Now, information is pushed to the web — and to smartphones and tablets and game consoles and social networks and everything else — and push has vanished into the fabric of How Things Are.

Generally, I think the same is true of the web vs. other methods of digital interaction — which is why the over-hyped delivery of the Wired article seemed so unfortunate. There isn’t a zero-sum game between the web and other ways of presenting information to customers — they all have their role in consumers’ lives, and increasingly form a spectrum to be tapped into as people choose. Even if apps and other methods of accessing and presenting that information take more parts of that spectrum away from the open web, I doubt content companies, telcos, or anybody else will kill the open web or even do it much damage.

The dogma of the open web

Frankly, both Anderson and Wolff do a good job of showing how adherence to the idea of the open web has calcified into dogma. Before the iPad appeared, there was a lot of chatter about closed systems that I found elitist and tiresome, with people who ought to know better dismissing those who don’t want to tinker with settings or create content as fools or sheep. Near the end of his article, Anderson seems to briefly fall into this same trap, writing that “an entire generation has grown up in front of a browser. The exploration of a new world has turned into business as usual. We get the web. It’s part of our life. And we just want to use the services that make our life better. Our appetite for discovery slows as our familiarity with the status quo grows. Blame human nature. As much as we intellectually appreciate openness, at the end of the day we favor the easiest path.”

That’s smart, except for the “blame human nature” part. Of course we favor the easiest path. The easiest path to doing something you want to do has a lot to recommend it — particularly if it’s something you do every day! I’m writing this blog post — creating something — using open web tools. Since this post is getting kinda long, I might prefer to read it on my iPad, closed system and all. The two co-exist perfectly happily. Ultimately, the web, mobile and otherwise, will blend in consumers’ minds, with the distinction between the web and other ways of accessing digital information of interest only to those who remember when such distinctions mattered and/or who have to dig into systems’ technological guts. There’s nothing wrong with that blending at all — frankly, it would be a little disappointing if we stayed so technologically silo’ed that these things remained separate.

Even if “big content” flows through delivery methods that are less open and more controlled, anybody with bandwidth will still be able to create marvelous things on the open web using an amazing selection of free tools. As various technological kinks are worked out, traffic and attention will flow seamlessly among the various ways of accessing digital information. And social search and discovery will increasingly counteract industrial search and discovery, providing ways of finding and sharing content other than through algorithms that reward popularity and scale. People who create good content (as well as a lot of content that’s ephemeral but amusing or diverting) will still find themselves with an audience, ensuring a steady flow of unlikely YouTube hits, Twitter phenomena, and hot blogs. The web isn’t dead — it’s just finding its niche. But that niche is pretty huge. The web will remain vigorous and important, while apps and mobile notifications and social networks grow in importance alongside it.

Top image by krossbow; iPad image by Kominyetska. Both used under a Creative Commons license.
