Jan. 14, 2009, 8:28 a.m.

David Ardia: Why news orgs can police comments and not get sued

I wish every managing editor in the country could see this 20-minute video. I’ve heard so many misconceptions over the years about news organizations’ legal ability to police, manage, or otherwise edit the comments left on their web sites. They say “the lawyers” tell them they can’t edit out an obscenity or remove a rude or abusive post without bringing massive legal liability upon themselves — and that the only solutions are to either have a Wild West, anything-goes comments policy or to not have comments in the first place.

That’s not true, and hasn’t been true since 1996. This is a video of David Ardia, at a conference of New England newspaper editors we both spoke at a few weeks ago. David is the director of the Citizen Media Law Project here at Harvard; he’s no wooly-haired Internet radical — he’s former assistant counsel for The Washington Post. Here he explains Section 230 of the Communications Decency Act (CDA 230 to its friends) and how it provides wide-ranging immunity to web-site publishers for what goes on in their comments. It’s not quite 100-percent immunity, as David explains, but it’s awfully close — so long as you’re not forcing your users into defamatory statements or materially changing the meaning of comments (say, adding the word “not” where it doesn’t belong), you’re immune from liability.

But don’t believe me, since in Internet shorthand, IANAL (I am not a lawyer); luckily, David is. For those who’d rather read than listen, a full transcript of his speech is below.

David Ardia:

So I am going to talk about a major piece of federal legislation that most of us refer to as CDA 230. It’s Section 230 of the Communications Decency Act.

Before I get into the details of the legislation, though, I want to give you a little bit of background about publisher and distributor liability, because it’s important to understand where the law was before Congress passed this act in 1996. And as most of you sitting in this room know, under traditional liability theories, a publisher is responsible for everything it publishes.

That means you’re responsible for letters to the editor in your paper. You’re responsible for the things you write. You’re responsible for the material that freelancers submit to you and that you publish.

There’s another form of liability that has developed along with publisher liability called distributor liability. And distributor liability applies to things like newsstands and libraries, where courts basically take the position that they’re not in a position to know everything that they’re making available to the public, so we’re not going to hold them liable.

If you were to think about what the world would look like if courts were to hold them liable, newsstands would have to read everything that they sell, and it’s unlikely that they’d be able to stay in business doing that. It’s also unlikely that they would carry a lot of controversial content.

So those two parallel tracks of liability developed over time, and when the Internet began to become a popular publishing platform in the ’90s, two major players were involved in the two big cases that led to this legislation.

The first was CompuServe. I don’t know if many of you remember it; this was sort of the geek’s version of America Online. It was a pretty free-wheeling place, and they ran a forum actually called Rumorville — probably not the best name for something if you want to avoid getting sued — which reported on the news business and gave journalists an opportunity to swap rumors.

They were sued in 1991 based on an allegedly defamatory statement in the Rumorville forum, and CompuServe argued — successfully at the time — that it should be viewed simply as a distributor. It was like a newsstand or a library. It wasn’t in a position to review everything that was put up on its servers, and it shouldn’t be responsible for what was in this forum.

A court in New York agreed and said, “We’re not going to hold you liable under the theory of distributor liability.”

Four years later came a case involving Prodigy, which I think at the time was an IBM venture. Prodigy had held itself out to be a family-friendly network — in a sense trying to differentiate itself in the market from CompuServe — saying, “You come to us and we’ve got wholesome, good fare on our site.” And they did a lot of policing of content. They would go in and remove stuff they found offensive or that didn’t meet their civility requirements. They were sued for a post in one of their forums. They also argued, as CompuServe had four years earlier, that they should be viewed as a distributor and not a publisher.

The court looked at what they were doing and said, “No, you actually get involved a little bit with the content. Because you’re holding yourself out as a family-friendly network, we are going to view you as a publisher. We’re going to hold you to the same standards that newspapers are held to, that a broadcast station is held to, and we are not going to let you avoid liability.” This was in 1995.

And what we had under those common-law principles of liability was a really perverse system, one that said that if you want to do good things, be a good Samaritan, and clean up the content on your site when you publish online, you are going to be taking on liability. If you simply allow people to say whatever they want and play no role at all in screening the content, then you are going to be free from liability.

So in 1996, Congress stepped in and said, “We think that’s a bad incentive for online publishers.” The Communications Decency Act came about that year. Vast portions of it were ultimately declared unconstitutional because they related to restrictions on speech around pornography and other obscene or obscenity-like speech. But the provision that lived on is this Section 230.

And actually the title of the section is “Protection for ‘Good Samaritan’ blocking and screening of offensive material.” So you can see where Congress intended this to reach. And the language is pretty simple. I’ll read it to you right now, and then we’ll talk about how it actually applies to what you do. Basically, Congress said that “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

So, basically, what that said is: we’re going to hold you to distributor standards, not publisher standards, for things that you publish online. And courts have taken that provision and essentially run with it. So, in listening to that language, the obvious question is: What is an interactive computer service? Is that some kind of special website? And the answer is no. For purposes of interpreting this statute, it’s any means of communication that involves multiple users accessing a computer system. So that’s forums, it’s comments on blog posts, it’s comments on articles. It could also be listservs, email lists that you may use or send out. All of those are covered. Anything that involves multiple users, multiple recipients, falls under this protection.

As a result of this provision, Section 230 of the Communications Decency Act, Internet publishers are treated differently from publishers in print, television, and radio — and you could argue that that’s terribly unfair. Many people have. That argument has not carried a lot of sway in the courts. And you might say that this protection should be provided offline as well as online. That argument has also been made. Congress has not been receptive to it.

But the one thing that’s clear from this act is that it does not immunize the original creator of the content itself. So the author of a defamatory statement, if they’re a user who posts a comment on one of your articles, can still be held liable. It’s just the website operator that has immunity under this provision. And because of that, there are still issues you may face, for example around subpoenas, because litigation could still ensue based on a defamatory comment or a comment that invades someone’s privacy. So there are other things to be concerned about, even if you don’t have liability for the comments.

So, what kinds of lawsuits are covered under this provision? It’s most frequently been used to bar defamation and defamation-like claims, but it doesn’t just apply to defamation. It applies to any state tort claim — so that includes invasion-of-privacy claims, publication of private facts, and rights of publicity (most states view that as a tort-like claim, although a small number of states treat it as a property, sort of intellectual-property-type claim), as well as negligence claims. Most of the garden variety of state common-law torts are covered by this.

It explicitly exempts from its coverage, however, criminal law, communications privacy law — those are things like wiretapping laws — and intellectual property claims. Rob [Bertsche] is going to talk a little bit later about copyright claims; those are not covered under this act. So you do not have immunity for copyright claims under the Communications Decency Act — although there are immunity provisions under a law called the Digital Millennium Copyright Act, which Rob is going to talk about.

So what kinds of activities are covered? What can you do, and what should you be concerned about because it might cost you your protection? The first thing is that screening content prior to publication is clearly covered under CDA 230. This is the quintessential activity the law was passed to address. It means that if you engage in traditional editorial functions — such as deciding what to allow on your site, removing foul language, or otherwise editing content, as long as you don’t materially change its meaning — you’re going to be covered under the act. For probably 99 percent of the things you do on a daily basis, or would like to do, those activities are going to be covered.

If you solicit or encourage users to submit content, you’re going to be covered. So think about a situation where, for example, you take the position that your site is going to be a civil, adult conversation — that you’re not going to allow the riffraff to come in and say all kinds of things. But some of that riffraff slips through, and you end up with a few bad comments on your site. That’s not going to change the liability picture for you; you’re still going to be immune.

An argument along those lines was made against an online site called ibrattleboro.com in Brattleboro, Vermont; it was the first case in Vermont to raise the CDA 230 protections for an online news site. The plaintiff in that case argued that the site had promised a clean discussion.

All of their terms and conditions said that they were going to carefully police the site. They had bold warnings on pages of the site saying: “We’re going to hold you to behaving civilly.” Somebody posted a statement on there that was arguably defamatory, and the plaintiff said: “Well, you made all these promises to your users and you didn’t fulfill them. You should be liable.”

The court said: “Nope, that’s not the way CDA 230 works. We think it’s a good thing for sites to try to create these kinds of civil places for conversation, but we’re not going to hold you liable if you are unsuccessful in making sure that every single comment on your site meets that standard.”

If you pay a third party to create or submit content, you’re also going to be protected, so long as the author of the material is not your employee. It’s sometimes a very complicated analysis as to whether someone is an independent contractor or an employee. And if you have independent contractors that you use to create content — many of you do — you probably should have your lawyers look at those agreements to make sure those people will truly be classified as independent contractors.

The same issue applies if you have delivery people; independent-contractor issues pervade your business. I know that from my experience at The Washington Post. If someone is an independent contractor for you, and they create something that is defamatory or invades someone else’s privacy and you publish it online, you’re not going to be liable for that content.

If you provide forms or dropdown boxes to facilitate user submission of content, you’re not going to lose your immunity as long as those forms and dropdown boxes are neutral. And I’ll talk in a little bit about what that means. But generally speaking, facilitating the submission of information from your users through more than just an open text box is not going to get you into any trouble.

One big question that comes up when I talk about these immunity provisions is: What happens if you’ve been notified by someone that a statement on your site is defamatory, or invades privacy, or somehow opens up a lawsuit? Do you have to take that material down? The answer to that is no.

The cases have been very clear in saying that you have no obligation to remove material from your site if you’ve been notified that it’s defamatory or otherwise problematic. If you think about that, it might strike you as: “Geez, that doesn’t sound right.” But courts don’t want to put you in the position of having to determine whether or not something is defamatory.

Obviously, truth is a defense to a defamation claim, so you’d be in the position, as a website operator, of having to engage in factual investigation after publication to determine whether or not to remove material. For many small publishers, that would mean taking the material down, because you’re not in a position to do that investigation.

You certainly wouldn’t be able to subject it to the kind of rigorous reporting out that you do for your own material. And as a result of that, we would have a vastly constrained platform for people to publish information on these kinds of sites.

[Question: What if you do take it down? Are you admitting that it’s defamatory?]

No. Your liability will not change by your taking it down or leaving it up. The thing to keep in mind in that situation is that you can take it down — that’s entirely your decision. It’s not the law’s place to force you to do that. I think in covering those major activities, that’s most of what you’re doing today and probably most of the things you anticipate doing in the near future.

There are a couple of things that could take you outside of these immunity provisions. These are the kinds of things you want to be careful of. The first is editing content in a way that materially changes its meaning. An obvious example: somebody puts up a post that says, “Jim Jones is not a murderer,” and you go in and edit the content and take out the word “not,” so all of a sudden it reads “Jim Jones is a murderer.” You’ve materially changed what the user submitted, and most courts would view you as a creator of the content and would not hold you to be immune. If all you did was remove a few adjectives or a few swear words from a post, and you didn’t materially change the meaning, that’s not going to change your immunity. That’s pretty straightforward.
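For anyone who builds comment tools, that distinction is easy to sketch in code. Here’s a minimal, purely illustrative TypeScript example — the word list and function name are invented for this post, not part of any real moderation API — of an edit that stays on the safe side of the line: it masks profanity but never deletes meaning-bearing words.

```typescript
// Hypothetical placeholder word list; a real one would be curated.
const PROFANITY = ["darn", "heck"];

// Mask swear words in place; never delete meaning-bearing words.
function scrubComment(comment: string): string {
  return PROFANITY.reduce(
    (text, word) => text.replace(new RegExp(`\\b${word}\\b`, "gi"), "****"),
    comment,
  );
}

console.log(scrubComment("That darn mayor did a heck of a job"));
// => "That **** mayor did a **** of a job" -- tone softened, meaning intact.

// By contrast, an edit like comment.replace(/\bnot\b/gi, "") would turn
// "Jim Jones is not a murderer" into "Jim Jones is a murderer" -- exactly
// the material change of meaning that can cost an operator its immunity.
```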

The other area, one that has developed over the last year and a half or so, is engaging your users through dropdown forms in ways that create discriminatory content. This arose from a case in California involving Roommates.com, a roommate-matching service. Roommates.com provided dropdown boxes for its users — it required every user to make selections. For example, users had to provide personal information about themselves and also designate whether they wished to live with straight or gay males, only straight males, only gay males, or no males at all. Then there were similar provisions for females.

Those of you who deal with Fair Housing Act issues in regard to advertising will recognize that those same requirements applied to Roommates.com, and a number of non-profit organizations that seek to prevent discrimination in housing sued Roommates.com, claiming it had violated the Fair Housing Act’s non-discrimination provisions. The case arose in California and bounced around in the Ninth Circuit, and ultimately the court said that CDA 230 does not immunize Roommates.com. It was an en banc decision, so every judge on the Ninth Circuit actually had a role in looking at and analyzing this case. Judge [Alex] Kozinski was the author of the opinion; he’s a fairly Internet-savvy judge. This was the first major case to significantly limit the immunity provisions in CDA 230, which is why I’m spending a few minutes talking about it.

And what the court said in this case was that Roommates.com shared responsibility for creating the content, because it forced its users to make discriminatory selections. Every user had to make that choice, and the only choices available were these discriminatory selections. That doesn’t mean that all interactive features you may offer to your users are problematic. We’ve advised clients on this. If you are going to allow users to choose from dropdowns, for example, you would want those dropdowns to include positive and negative things. You would want those dropdowns to be neutral — you wouldn’t want to create dropdowns that say, “My mayor is — select a dropdown — a thief, a crook, a liar.” That’s not neutral. [Audience member: It’s true!] Obviously, truth is a defense, so if you get sued and it turns out your mayor is a thief, a crook, and a liar, you’re off the hook. But you’re still going to have to face litigation costs. The great thing about CDA 230 is that it lets you get out of the case right at the start — you don’t have to face litigation costs.
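For site builders wondering what “neutral” might look like in practice, here’s a purely hypothetical TypeScript sketch in the spirit of that reasoning — none of these names or options come from the case or any real product. The point is simply that the menu itself shouldn’t force users toward an accusation.

```typescript
// Problematic: every available choice is an accusation, so the site itself
// arguably helps develop the defamatory content.
const loadedMayorOptions = ["a thief", "a crook", "a liar"];

// Safer: balanced choices that don't steer users toward defamation.
const neutralMayorOptions = [
  "doing a great job",
  "doing an average job",
  "doing a poor job",
  "no opinion",
];

// Hypothetical helper that renders a rating field from a list of options.
function buildRatingField(prompt: string, options: string[]): string {
  const items = options.map((o) => `<option>${o}</option>`).join("");
  return `<label>${prompt} <select name="rating">${items}</select></label>`;
}

console.log(buildRatingField("My mayor is", neutralMayorOptions));
```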

So think about adding new features. One of the things that I hope comes out of the conversation we’re having this morning — and I know Josh is going to talk about it at lunch — is how to interact more with your audience. That means more than simply giving them an open text box in which to have their say. That interaction can happen in a lot of ways, and CDA 230 shouldn’t interfere with the innovative ways you want to let users submit content, apart from this limitation on creating tools that are essentially defamation-making machines. You don’t want to do that.

Let me give you the five takeaways from this.

— The first is that if you passively host third-party content, you are going to be fully protected under Section 230.

— If you exercise traditional editorial functions over user-submitted content, such as deciding whether to publish, remove, or edit material, you will not lose your immunity unless your edits materially change the meaning of the content.

— If you pre-screen objectionable content, or correct, edit, or remove content after publication, you are not going to lose immunity.

— If you encourage or pay third parties to create or submit content, you will not lose immunity.

— If you use dropdown forms or multiple-choice questionnaires, be careful: make sure those forms allow users to submit information in a neutral way.

Thank you.

Joshua Benton is the senior writer and former director of Nieman Lab. You can reach him via email (joshua_benton@harvard.edu) or Twitter DM (@jbenton).