CDA 230 stands for Section 230 of the Communications Decency Act, a federal law that took effect in 1996. The CDA’s primary purpose was to tackle a growing problem Republican lawmakers had discovered on the world wide web – porn. Gasp! Porn was corrupting the youth.

Most of the CDA was found unconstitutional a year later (hence, porn literally everywhere on the internet), but the 26 words of Section 230 survived.

CDA 230 says “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” The impetus for 230 was to deal with defamation lawsuits on online bulletin boards. A court case (Stratton Oakmont v. Prodigy) had ruled that if a bulletin board made any editorial decisions – like removing abusive or defamatory posts – it could then be held liable as the publisher of all content on the bulletin board. So lawmakers created Section 230 with the original intent to encourage interactive computer services – like bulletin boards – to responsibly moderate their content.

Instead, though, over the last twenty-three years, internet companies have looked to Section 230 as a shield that protects them from any responsibility for harms that occur on their sites. It has had the exact opposite effect it was supposed to have – instead of incentivizing tech companies to moderate content responsibly, it rewards them for taking no action whatsoever. Companies need not even have policies or staff to deal with the content users upload.

Last year we wrote about our case Matthew Herrick v. Grindr. Our client Matthew was the victim of impersonation by a vengeful ex-boyfriend on the dating app Grindr. His psycho ex used the app to arrange sex dates with over a thousand men at Matthew’s home and workplace, many of whom were under the impression that Matthew wanted to role-play rape fantasies.

When Grindr claimed it couldn’t prevent Matthew’s ex from using the platform (yeah, right!), we, along with co-counsel Tor Ekeland Law, PLLC, sued Grindr under a product liability theory. We told them: “If you did not design into your product a way to prevent known abusive users from continuing to use your product – when it’s an arithmetic certainty that a dating app with geo-locating technology will sometimes be used by predators, stalkers, and rapists – then you’ve released a defectively designed product into the stream of commerce.”

This is a perfect example of how laws like CDA 230 carve out an exemption for certain types of companies from the laws the rest of us have to follow. We are all at the mercy of the omniscient, omnipotent, and ungodly wealthy tech industry. Until judges start interpreting CDA 230 differently or the law is reformed, tech giants will never be held accountable for the harms their platforms cause. We have explored some of these harms below, to help explain how CDA 230 is screwing all of us (except, y’know, Zuckerberg and pals).

FAQs ABOUT CDA 230

Q: What does revenge porn have to do with it?

A: CDA 230 has been interpreted so expansively that it protects websites and social media companies that allow users to upload your naked pictures without permission. In California, there’s case law (Hassell v. Bird) saying that even if you get a court order against somebody who posted abusive content, the website or social media company can’t be bound by it. That’s a huge problem for victims of cyber sexual abuse. The longer revenge porn stays online, the greater the harm it can cause the victim (getting fired from their job, losing custody of kids, having strangers harass them online and in the street, etc.). So, when tech companies argue against laws that would require them to take down these pictures, they’re putting themselves on the side of the offender. They’re saying that internet companies should have more power over a person’s nude pics or sex videos than the person depicted.

So Big Tech gets to decide whether a naked picture posted to the internet without my permission stays up or comes down? That’s f*cked up!

Q: So you want internet platforms to be responsible for monitoring every single thing users post?

A: No, we don’t want Instagram to fact-check every caption. We don’t want every individual who is defamed in a Facebook post to sue Facebook. We just want to hold tech companies to the same standards as any other industry. A more reasonable interpretation of CDA 230 would allow victims to sue platforms when they’ve suffered injury from foreseeable harms the platform caused. We’re talking about victims of nonconsensual pornography when the site has been notified about the content; families who lost loved ones in the Myanmar genocide that Facebook propaganda enabled; politicians who lose elections because social media companies run ads spreading lies; people whose rapes and murders are livestreamed; children whose images of sexual abuse are shared on mainstream cloud storage servers. The current statute has been understood to protect internet platforms from any responsibility whatsoever for the sh*t that happens on their platforms. Any other company can be sued for creating a defective product, so why not Big Tech?

Q: But won’t reforming CDA 230 hurt free speech?

A: No! The CDA isn’t about protecting speech at all. The lobbyist groups and academics drunk on the Kool-Aid want everybody to think that if tech companies can be held liable for the harms they cause, they will over-censor and free speech on the internet will die. They need to stop being so naïve – or stop pretending we are. Platforms like Facebook, Google, YouTube, Twitter, and Tinder use speech, connection, and engagement as lures to get us hooked on the convenience of their products. But they are not here to serve us in some noble, democratizing way. They make money from selling ads and collecting our data. It’s okay if these companies get hauled into court once in a while for the harms they cause. It’s the only way they can be incentivized to make their products responsibly. And just like every other industry, they’ll simply get liability insurance to cover the costs of litigation. In fact, they probably already have it. Getting rid of CDA 230 wouldn’t make platforms responsible for removing speech that would otherwise be protected. It would just make them responsible for the abuses they create. As Neil Turkewitz put it: “Section 230 doesn’t protect speech — that’s what the First Amendment does. Section 230 protects a special category of actors.”

Q: So… who does it protect?

A: The law protects “interactive computer service” providers, a term defined as “any information service, system, or access software provider that provides or enables computer access by multiple users to a computer server.” Courts have recognized dating apps like Grindr, social media platforms like Facebook, and other big tech companies including Amazon and Google as falling under this protection. If it’s a digital platform, you can pretty much expect that it’s covered by CDA 230. This is true even for platforms created for the purpose of doing illegal things, like posting revenge porn. While the types of harms might differ from one kind of platform to another, the principle is the same: if you know your platform is hurting people, you should do something about it.

Q: I keep hearing Republicans talking about CDA and neutrality. What’s that all about?

A: We think internet platforms should be held liable when content causes injury. The GOP thinks internet platforms should be held liable when content is removed for expressing unpopular political ideas. In June, Republican Sen. Josh Hawley introduced a bill that would require big platforms to prove they are politically neutral in order to keep their protection under CDA 230. This is not the change to Section 230 we want, but it is an interesting moment when we agree with the far right that the power of these tech companies must be contained. Our priority is less about “discrimination against conservative viewpoints” and more about the actual abuse our clients are experiencing online.

Q. So what do we do?

A: We need to fix CDA 230. And we need to demand that our elected officials and presidential hopefuls tell us their positions. Big Tech has a lot of money and a lot of lobbyists. We need to be able to get emergency court orders to remove content that causes imminent danger – like impersonating dating and sex ads. We need CDA 230 immunity to apply only to claims relating to publication, like obscenity and defamation. And if a company being sued thinks it’s entitled to immunity, it should have to plead that – instead of courts just dismissing the case early on.

When we give an enormous industry blanket immunity from the dangers of their platforms, we shouldn’t be surprised when they allow psychos and trolls to inflict serious harm. The justice system is where people are supposed to get justice. We should all be terrified as f*ck if the tech industry is outside the reach of our courts.


If you are a victim of online abuse and want to sue the sh*t out of the platform that enabled it, we’re here to help! Contact us at 646-666-8909 or through our online submission form.

Sign up for our newsletter and follow us on Facebook, Twitter, and Instagram to stay up to date with important news as it breaks!

Want to know more about what’s wrong, how it has very real consequences for victims, and what can be done to help? Carrie’s book, Nobody’s Victim, is a must-read. A bold and much-needed analysis of victim protection in the era of the Internet, Nobody’s Victim is an urgent warning of a coming crisis, a predictor of imminent danger, and a weapon to take back control and protect ourselves – both online and off.


You might also want to check out:

The Top 5 Myths about Online Abuse

Revenge Porn Laws by State

Incident Tracking Chart