Nieman Foundation at Harvard
May 26, 2010, 4:41 p.m.

Today’s Facebook changes and Zuckerberg’s law

So Mark Zuckerberg just announced the privacy-setting changes Facebook will begin rolling out. They’re about what you’d expect, given what Zuckerberg’s been saying over the past week: increased transparency, increased simplicity, etc., etc. One thing I didn’t expect, though: the fact that the description of the changes would have a theme.

See if you can spot it. I’ll help:

The number one thing we’ve heard is that there just needs to be a simpler way to control your information. We’ve always offered a lot of controls, but if you find them too hard to use then you won’t feel like you have control. Unless you feel in control, then you won’t be comfortable sharing and our service will be less useful for you.

That’s the tip of the (Zucker?)berg: The word “control” comes up 22 times in Zuck’s privacy-changes announcement (a post entitled “Making Control Simple”). And the word comes up 16 times in the section of Facebook’s site launched this afternoon, the “privacy page” dedicated to simple (and, really, simplistic) explanation of the site’s new approach to privacy settings.

That page’s title? “Controlling How You Share.”

So: control! Message received! Power over our information, back in our hands! If danah boyd is right — if privacy is, fundamentally, about control — then today’s Facebook changes vindicate that. The updates — and the apologetic and only occasionally defensive language Zuckerberg uses to describe them — are a nod to the power of Us, the aggregate: The backlash built; we rebelled; Facebook changed its ways to appease us. We fought against Zuckerberg’s law; Facebook let us win. Today’s changes suggest the company’s realization that our personal information is, for most of us, not a communal good so much as a highly proprietary one. Our information. Which we want to — and, more to the point, should — control.

What the control metaphor really acknowledges, though, is our new understanding of “sharing” itself: as an act that is, implicitly if ironically, self-interested. On the web, there’s a performative aspect to self-revelation; as Steven Berlin Johnson put it in his recent Time magazine essay, “we curate our private lives for public exposure.” What Facebook had underestimated, in its latest round of information-unharnessing, is the extent to which we see our information not just as extensions of ourselves, but also — and more so — as cultivated representations of ourselves. As performances. As products.

In publishing, the traditional version, there’s always been a kind of motivational tension between the individual and the communal. And we’ve navigated it by enforcing a distinction between publication and publicity — with publication generally representing the selfless side of sharing, and publicity generally representing the selfish. Publication, the relatively passive act, serves civic ends, we assumed; publicity, the relatively active one, serves our own. Within that framing, publicizing our work — which was only a short leap from publicizing ourselves — became an act of arrogance. “Self-promoter” took on the air of epithet.

But the new ubiquity of sharing on the web means that, increasingly, publication subsumes publicity. The two are now wrapped up in each other, implicatively; selflessness and selfishness coexist in the content we share. Which means that, for us, publishing our photos, or our updates, or our ‘likes,’ or whatever else on Facebook is not just a communal act, but also an individual one. And the content that sharing produces is not just personal, but proprietary. Something that we should be able to — yes, Mark Zuckerberg — control.

Image by Dave McClure used under a Creative Commons license.
