In the final part of our blog series on Carrie Goldberg’s congressional testimony, we bring you Carrie’s proposal for Section 230 reform.

“We have a real mess here, but a fixable one.”

Congress created Section 230, and Congress has the power to fix it. I consider any proposal for reform through the lens of the most wrenching harms I see in my office.

Americans are being injured by tech companies running amok, unconstrained by regulation, liability for their products, or the threat of litigation. Everyday people have lost their fundamental right of access to the courts to vindicate their injuries. This has created an undeserved windfall for the tech industry, allowing it to become the most powerful, wealthy, omnipotent, and omniscient industry in the history of the world. The trio of corporations, courts, and Congress birthed a monster. Through legislative reform, Congress can fix what corporations won’t because of greed and courts can’t because of bad accumulated case law. Anybody could become my next client. What is illegal offline should be illegal online.

Any legislation must distinguish between hosting defamatory content and enabling criminal conduct. The first deserves Section 230 protection; the second does not.

Short of eradicating Section 230, my recommendations (below) are most similar to the Key Takeaways and Recommendations published by the Department of Justice in June 2020 after its February 2020 symposium, “Section 230 – Nurturing Innovation or Fostering Unaccountability.” I was relieved to see many of my own recommendations from that event adopted by DOJ. These recommendations recognize that large tech platforms are no longer nascent or fragile (if ever they were), preserve competition, and keep core immunity for defamation to foster free speech.


Proposal for Section 230 reform:

  • Conduct carve-outs
    • Bad Samaritan carve-outs – no immunity from civil liability for platforms that:
      • purposefully facilitate or solicit third-party content or activity that violates criminal law; or
      • are willfully blind to illicit conduct (e.g., failing to detect or respond to illegal conduct, preventing or seriously inhibiting swift detection and banning of offenders, impeding law enforcement’s ability to investigate and prosecute serious crimes, or depriving victims of the evidence they need to bring civil claims against their perpetrators).
    • Egregious conduct carve-outs – no immunity for the worst types of conduct — claims involving child exploitation, sexual abuse, terrorism, and stalking. Section 230 was never intended to shield platforms from liability so far outside the original purpose of the statute.
    • Actual knowledge and court judgments – no immunity where a platform has actual knowledge or notice that third-party content violates criminal law, or where it ignores a court order indicating that content is unlawful or that content or conduct on the platform underlies a criminal case or civil restraining order.
  • Injunctive relief for emergency cases where the plaintiff is suffering imminent harm because of activity on a platform, where a court has ruled content unlawful, or where content or conduct occurring on a platform is the basis of a criminal case or civil restraining order
  • The interactive computer service (ICS) is the information content provider (ICP), and therefore not entitled to immunity, for claims pertaining to:
    • Breaches of its own terms of service;
    • Breached promises made to users or the public;
    • Testimony of its executives under oath;
    • Constructive notice of the specific harm and damages;
    • Paid content, including in-kind payment, whether paid to or from the ICS;
    • Content recommended to users via algorithm; or
    • Defectively designed or manufactured products, or failure to warn.
  • Define “information content” to include only speech-based content
  • Limit immunity to only publication-related torts like obscenity and defamation.

For a full redline version of Carrie’s proposed reforms, the Herrick Act Against Violence Online (“HAAVO”), please see below (edits in italics):

A BILL

To amend Section 230 of the Communications Act of 1934 to reaffirm victims’ rights and consumer protections.

SHORT TITLE

This Act may be cited as the Herrick Act Against Violence Online (“HAAVO”).

(c) PROTECTION FOR “GOOD SAMARITAN” BLOCKING AND SCREENING OF OFFENSIVE MATERIAL.

(1) TREATMENT OF PUBLISHER OR SPEAKER. (A) No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information speech provided by another information speech content provider.

(2) CIVIL LIABILITY. No provider or user of an interactive computer service shall be held liable on account of—

(A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, promoting terrorism or violent extremism, harassing, promoting self-harm, or unlawful, or otherwise objectionable, whether or not such material is constitutionally protected; or

(B) any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in Subparagraph (1).

(d) EXCLUSION FROM “GOOD SAMARITAN” IMMUNITY.

(1) “BAD SAMARITAN” CARVE-OUT. Subsection (c)(1) shall not apply in any criminal prosecution under State law or any State or Federal civil action brought against an interactive computer service provider if, at the time of the facts giving rise to the prosecution or action, the service provider acted purposefully with the conscious object to promote, solicit, or facilitate material or activity by another content provider that the service provider knew or had reason to believe would violate Federal criminal law, if knowingly disseminated or engaged in. 

(2) CARVE-OUT FOR ACTUAL NOTICE OF FEDERAL CRIMINAL MATERIAL. Subsection (c)(1) shall not apply in a criminal prosecution under State law or any State or Federal civil action brought against an interactive computer service provider if—

(A) such prosecution or action arises out of a specific instance of material or activity on the service that would, if knowingly disseminated or engaged in, violate Federal criminal law; 

(B) the provider had actual notice of that material’s or activity’s presence on the service and its illegality; and

(C) the provider failed to do any of the following:

(i) expeditiously remove, restrict access to or availability of, or prevent dissemination of the specific instance of material and take reasonable steps to remove, restrict access to or availability of, or prevent dissemination of the material across the service;

(ii) thereafter report the material or activity to law enforcement when required by law or as otherwise necessary to prevent imminent harm; or

(iii) preserve evidence related to the material or activity for at least 1 year.

(3) JUDICIAL-DECISION CARVE-OUT. Subsections (c)(1) and (c)(2) shall not apply in any criminal prosecution or civil action or injunction arising from the failure of an interactive computer service provider to remove, restrict access to or availability of, or prevent dissemination of material within a reasonable time after receiving notice of a final judgment from a court in the United States indicating that such material or activity is defamatory under State law or unlawful in any respect. However, no interactive computer service provider shall be held liable for removing, restricting access to, or preventing dissemination of material in response to receiving such notice.

(4) NOTICE MECHANISM REQUIREMENT. An interactive computer service provider shall make available to the public, without expense, an easily accessible and apparent mechanism for notifying the provider of defamatory or unlawful material or activity as described in Subsections (d)(2) and (3). An interactive computer service provider shall not be entitled to assert immunity under Subsection (c)(1) if it designs or operates its service to avoid receiving actual notice of Federal criminal material on its service or the ability to comply with the requirements under Subsection (d)(2)(C).

(d) (e) OBLIGATIONS OF INTERACTIVE COMPUTER SERVICE. A provider of interactive computer service shall, at the time of entering an agreement with a customer for the provision of interactive computer service and in a manner deemed appropriate by the provider, notify such customer that parental control protections (such as computer hardware, software, or filtering services) are commercially available that may assist the customer in limiting access to material that is harmful to minors. Such notice shall identify, or provide the customer with access to information identifying, current providers of such protections.

(e) (f) EFFECT ON OTHER LAWS.

(1) NO EFFECT ON CRIMINAL LAW OR FEDERAL CIVIL ENFORCEMENT. Nothing in this section shall be construed to impair the enforcement of section 223 or 231 of this Act, chapter 71 (relating to obscenity) or 110 (relating to sexual exploitation of children) of title 18, United States Code, or any other Federal criminal statute. Nothing in this section shall be construed to prevent, impair, or limit the enforcement by the United States, or any agency thereof, of any civil Federal statute or regulation.

(2) NO EFFECT ON INTELLECTUAL PROPERTY LAW. Nothing in this section shall be construed to limit or expand any law pertaining to intellectual property.

(3) STATE LAW. Nothing in this section shall be construed to prevent any State from enforcing any State law that is consistent with this section. No cause of action may be brought, and no liability may be imposed, under any State or local law that is inconsistent with this section.

(4) NO EFFECT ON COMMUNICATIONS PRIVACY LAW. Nothing in this section shall be construed to limit the application of the Electronic Communications Privacy Act of 1986, or any of the amendments made by such Act, or any similar State law.

(5) NO EFFECT ON SEX TRAFFICKING LAW. Nothing in this section (other than subsection (c)(2)(A)) shall be construed to impair or limit—

(A) any claim in a civil action brought under section 1595 of title 18, United States Code, if the conduct underlying the claim constitutes a violation of section 1591 of that title;

(B) any charge in a criminal prosecution brought under State law if the conduct underlying the charge would constitute a violation of section 1591 of title 18, United States Code; or

(C) any charge in a criminal prosecution brought under State law if the conduct underlying the charge would constitute a violation of section 2421A of title 18, United States Code, and promotion or facilitation of prostitution is illegal in the jurisdiction where the defendant’s promotion or facilitation of prostitution was targeted.

(6) NO EFFECT ON ANTI-TERRORISM CLAIMS. Nothing in this section shall be construed to prevent, impair, or limit any claim in a civil action brought under section 2333 of title 18, United States Code.

(7) NO EFFECT ON CHILD SEX ABUSE OR CHILD SEXUAL EXPLOITATION CLAIMS. Nothing in this section shall be construed to prevent, impair, or limit any civil action brought under State or Federal law relating to claims of child sexual abuse or child sexual exploitation.

(8) NO EFFECT ON CYBER-STALKING LAWS. Nothing in this section (other than subsection (c)(2)(A)) shall be construed to prevent, impair, or limit any civil action in State or Federal court relating to harm suffered from conduct that would constitute a violation of section 2261A(2) of title 18, United States Code.

(9) NO EFFECT ON ANTITRUST LAWS. Nothing in this section shall be construed to prevent, impair, or limit any civil action brought under the Federal antitrust laws.

(10) NO EFFECT ON PRODUCT LIABILITY CLAIMS. Nothing in this section shall be construed to prevent, impair, or limit any civil action brought against an interactive computer service for defects in its own design or manufacture, or for failure to warn users and the public of serious risks.

(11) NO EFFECT ON WRONGFUL DEATH ACTIONS. Nothing in this section shall be construed to prevent, impair, or limit any civil action for wrongful death.

(f) (g) DEFINITIONS. As used in this section:

(1) INTERNET.

The term “Internet” means the international computer network of both Federal and non-Federal interoperable packet switched data networks.

(2) INTERACTIVE COMPUTER SERVICE.

The term “interactive computer service” means any information service, system, or access software provider that provides or enables computer access by multiple users to a computer server, including specifically a service or system that provides access to the Internet and such systems operated or services offered by libraries or educational institutions.

(3) INFORMATION SPEECH CONTENT PROVIDER.

The term “information speech content provider” means any person or entity that is responsible, in whole or in part, for the creation or development of information provided through the Internet or any other interactive computer service. Being responsible in whole or in part for the creation or development of speech includes, but is not limited to, instances in which a person or entity solicits, comments upon, receives payment or payment in-kind for, funds, algorithmically directs, provides testimony under oath as an executive employed by the interactive computer service regarding, or affirmatively and substantively contributes to, modifies, or alters speech provided by another person or entity.

(4) ACCESS SOFTWARE PROVIDER.

The term “access software provider” means a provider of software (including client or server software), or enabling tools that do any one or more of the following:

(A) filter, screen, allow, or disallow content;

(B) pick, choose, analyze, or digest content; or

(C) transmit, receive, display, forward, cache, search, subset, organize, reorganize, or translate content.


Now Watch:

Harmed By Section 230: Barring Access to the Courtroom from the Alliance to Counter Crime Online