Omegle lawsuit update: Groundbreaking C.A. Goldberg Section 230 case prevails on product liability claims

 

This week the U.S. District Court for the District of Oregon issued its decision that our case, A.M. v. Omegle, will advance on all four of its product liability claims, overcoming Omegle’s argument for Section 230 immunity.

This is massive progress on Section 230 liability for dangerous tech products.

 

The product:

Omegle is a website, launched in 2009, that randomly pairs strangers for video chats. It markets itself to children. A common use of the platform is for strangers to engage in online sex with one another. The platform’s design, which pairs anonymous users with no age inquiry or verification, makes it inevitable that children and adults will be matched. Consequently, the platform is a magnet for child predators.

 

The lawsuit: A.M. v. Omegle.com LLC

According to our lawsuit, filed in November 2021: “This is a product liability case against a company whose product procures children for sexual predators.”

True to its design, Omegle matched our client, then 11 years old, with a nearly 40-year-old pedophile who abused her for years.

Our lawsuit claims that Omegle is defective and negligent in both its design and its warnings. It points to the fact that Omegle has been the subject of ongoing investigations by cybercrime experts and law enforcement officers all over the world, yet has taken no known action to mitigate the frequent occurrence of child exploitation on its platform. Indeed, Omegle touts the dangers of its product on its own homepage.

 

This week’s decision:

The decision we received this week denied Omegle’s Motion to Dismiss on the basis of Section 230. Though Section 230 was originally designed to address defamation lawsuits involving online bulletin boards, in the decades since, tech companies have twisted it into a shield that protects them from any responsibility for dangerous practices. Consequently, Section 230 has had the opposite of its intended effect: instead of incentivizing tech companies to behave responsibly, it carved out an exception that exempts certain types of companies from the laws the rest of us have to follow.

“Omegle argues it should be entitled to immunity. I disagree,” wrote Michael W. Mosman, Senior United States District Judge for the District of Oregon, where the case was filed.

Judge Mosman agreed that “What matters for purposes of those claims is that the warnings or design of the product at issue led to the interaction between an eleven-year-old girl and a sexual predator in his late thirties.” As Naomi Leeds wrote in our motion, and as Judge Mosman quoted in the decision, “The random pairing function of adults and children and the service’s accessibility to both adults and children work in tandem. Plaintiff’s claims thus have nothing to do with information provided by a user. It is the website’s sole function of randomly matching children with adults that causes the danger. This function occurs before content occurs.”

 

What’s next?

This is far from over. We’re now waiting to hear on the second Motion to Dismiss, in which Omegle argues that its product is not a product. We will be arguing against that in August. Watch this space.

We are confident we will overcome the product issue as we did Section 230, but our case against Omegle is a reminder of why any legislation modifying 230 cannot apply only to large internet companies. Omegle doesn’t touch Meta in size, but its product is about as harmful as it gets: livestream video chatting for sex that matches children with adults.

Remember: The harms are not a bug. They’re a feature.

 

Read the decision here.

 

Appearing for plaintiff are Carrie Goldberg and Naomi Leeds of C.A. Goldberg, PLLC, and Barb Long of Vogt Long.
