In the fourth installment of our blog series drawn from Carrie Goldberg’s congressional testimony, we explore the Section 230 reform proposals on the table.

At the December 1 hearing in Congress, four bills were discussed. Each is a step in the right direction toward correcting the overbreadth our courts have read into Section 230, and each contains laudable elements. Here is Carrie's take on them:

1 The Civil Rights Modernization Act of 2021 (H.R. 3184)

This bill removes the liability exemption for targeted ads that discriminate. This is important because companies should not be immune from liability for content, like ads, that they profit from. I'd take it a few steps further. In my opinion, the monetization of content ought to transform the Interactive Computer Service into an Information Content Provider with respect to that content. Alternatively, with paid content, the platform is not truly performing the role of an Interactive Computer Service, but rather that of a billboard or marketer. Narrowing the definitions of ICS and ICP to specify when paid content changes those designations would be a more expansive way to address the issue. Lastly, the CRMA applies only to claims for injuries stemming from civil rights violations. Unfortunately, that would exclude the most vicious harms and serious injuries we see online, which are typically based not on discrimination but on violence.

2 The Protecting Americans from Dangerous Algorithms Act (H.R. 2154)

This bill smartly and appropriately classifies algorithms as Information Content Providers. But it is so narrow as to be almost unusable. It excludes many presentations of algorithms (e.g., content that is ranked, ordered, promoted, recommended, amplified, or altered in a way that is obvious, understandable, and transparent to a reasonable user; listed chronologically; sorted by user ratings or number of reviews; alphabetical; random; or organized by views, downloads, or the like). It applies only to large companies, which is a disappointing delimiter since some of the most deliberately malicious platforms are small. Oddly, it rules out all claims except those pertaining to equal rights and injuries from international terrorism. My clients would see no benefit from it.

3 The Justice Against Malicious Algorithms Act (H.R. 5596)

This bill is a far superior way to address injuries caused by algorithms than H.R. 2154. It creates a new exception to immunity for suits against ICSs for injuries caused by algorithmically directing a user to material that injures them. I could envision this applying to scenarios exposed recently, such as Instagram directing teenage girls to thinspiration and weight-loss content and playing a causal role in the eating disorders that result. I further envision it benefitting my clients who are matched through dating apps with dangerous individuals, such as my client Matthew Herrick. I do fear that the exception to the exception for user-specified searches will create a sinkhole many worthy plaintiffs will fall into. I also recommend against narrowing the mens rea to require that the ICS “knew or should have known” or “acted recklessly”; these standards will be abused by defendants and over-interpreted by courts. Lastly, injuries ought not to be limited to “physical or severe emotional injury.” Although most of my clients do suffer severe emotional injury, we must not overlook financial injuries and the like.

4 The Safeguarding Against Fraud, Exploitation, Threats, Extremism, and Consumer Harms Act (“SAFE TECH Act”) (H.R. 3421)

Of the four bills, this is the one that would do the most to vindicate the injuries I deal with. First, in (c)(1) it changes the immunity from applying to “any information” provided by a third party to “any speech” provided by a third party. Hopefully courts will recognize this as an invitation to start distinguishing between content-based harms (immune) and conduct-based harms (hopefully not immune).

SAFE TECH also removes immunity for speech that the ICS profited from or funded. As I have long advocated, this bill makes Section 230 immunity an affirmative defense for which the defendant bears the burden of persuasion. This has the advantage of relieving the court of having to play computer scientist at the 12(b)(6) stage, determining without facts from the defendant whether it is in fact an ICS being treated as the publisher or speaker of information provided by another ICP. Other excellent provisions of SAFE TECH include the immunity exclusion for injunctive relief, which is important when a victim's main concern is getting illegal material, such as nonconsensual pornography or child sexual exploitation material, off the website. Finally, it creates five new categories of carve-outs: claims relating to 1) civil rights laws, 2) antitrust laws, 3) stalking, harassment, or intimidation laws, 4) international human rights laws, and 5) wrongful death actions.

The carve-outs are good, especially in that they apply to both state and federal laws, but they require some tweaking of the language to be usable by injured plaintiffs in litigation. Most importantly, we need the carve-outs to apply to the facts, not the laws. This is a technical point, but an important one. Many crimes, such as stalking, harassment, and human rights abuses, do not carry a private right of action. That is, victims cannot sue for the violation of these laws and instead must use classic tort law to make their claims. For instance, my stalking clients in New York could not bring a lawsuit against their stalker or against a platform that facilitated the stalking, because there is no private right of action for stalking; it exists only as a criminal law. Instead, a victim would sue the platform for negligence, breach of contract, or negligent infliction of emotional distress. We also must eliminate the delimiter on the stalking, harassment, and intimidation carve-out, which requires that the harmful conduct be based, “in whole or in part,” on a protected class (i.e., sex (including sexual orientation and gender identity), race, color, religion, ancestry, national origin, or physical or mental disability).