How many more children have to die? One year since Frances Haugen blew the whistle on Facebook, what has changed?

In September 2021, The Wall Street Journal began publishing The Facebook Files, a series of investigative reports into Facebook (now Meta). They revealed that Facebook ignored research and warnings about dangerous product design decisions, decisions that exposed vulnerable groups of people to dangerous content. For one of those groups, teenage girls, Instagram’s promotion of eating disorders, self-harm, and suicide proved particularly toxic.

On October 4, 2021, Frances Haugen revealed on 60 Minutes that she was the whistleblower who had released hundreds of internal documents, giving the world a disturbing look under the hood of the social network. “The thing I saw at Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook, and Facebook over and over again chose to optimize for its own interests, like making more money,” Haugen explained.

This is something C.A. Goldberg, PLLC has been acutely aware of since our inception in 2014, and it became increasingly evident in our fight to reform Section 230 of the Communications Decency Act, a law created in 1996 that shields tech companies from liability for all the ways they harm people. During 2021, we brought lawsuits against tech companies including Amazon, for selling suicide products to kids, and Omegle, whose online chat room gave a pedophile access to our 11-year-old plaintiff, allowing him to groom her and hold her in sexual servitude for three years.

  

One year on   

A lot has changed in the year since the Facebook Files dropped. For one, Facebook’s name. By October 28, 2021, Facebook had announced it would be rebranding as “Meta.” The company didn’t admit the move was an attempt to distance itself from its creepy public image. If anything, it embraced that image: for Zuckerberg, Meta marked the beginning of a strategy to pull users into the ‘Metaverse’, a virtual platform where they would spend ever larger, more immersive stretches of their lives.

But the change of name didn’t do enough to distract from the content of the documents Haugen released, which, along with her testimony to lawmakers in the US and internationally, illuminated Facebook’s dangerous approach to hate speech, misinformation, terrorism, and ethnic cleansing, and its exploitation of its most vulnerable users: children and teens. It also shone a bright light on Section 230.

Facebook revenue, 2021: $117.9 billion, up 37% year-on-year.

  

Carrie testifies in Congress   

In December 2021, our founder Carrie Goldberg testified in the U.S. House of Representatives at a hearing entitled “Holding Big Tech Accountable: Targeted Reforms to Tech’s Legal Immunity.”   

Her testimony explained how Section 230 of the Communications Decency Act (a ’90s law originally intended to combat pornography) had metastasized into a shield protecting Big Tech – arguably the most powerful companies in the world – from responsibility for terrorism, genocide, child sexual exploitation, illegal firearms dealing, and stalking. “Ironically, my clients, especially my exploited underage clients, are the ones that the 1995 Congress was trying to protect. Yet, this is the population most victimized by the creep of immunity,” she explained.

“In the majority of my cases, well over a thousand now, my clients’ injuries were facilitated by tech companies,” she testified. “The people — victims of child sexual exploitation, cyberstalking, trafficking — hire me as their lawyer expecting me to avenge their damages. The worst part of my job, though, is telling people who’ve suffered horrific nightmares that Congress took away their right to justice. We can’t sue, I tell them; Congress passed a law in the ’90s that lets tech companies get away with what they did to you.” – Carrie Goldberg to Congress, December 2021

 

In the past year, Instagram has instituted some functional changes, such as parental supervision tools, that, while appearing like progress, shift the responsibility for keeping child users safe into the hands of parents who may be unaware, overstretched, or ill-equipped to protect their kids from the strategic insidiousness of social media products. What’s more, the Facebook Papers revealed that not only was Meta invested in hacking the reward centers of children’s brains to addict them to its products – it was also deeply interested in the phenomenon of young people having secondary, secret accounts that their guardians did not know about.

  

September 27, 2021: Facebook puts a hold on the development of a kids’ version of Instagram, geared toward children under 13.

  

If tech companies won’t change, state laws must  

This year, Carrie Goldberg collaborated with New York State Senator Andrew Gounardes to create a bill, the New York Child Data Privacy and Protection Act, which would require tech companies to commit to privacy and safety for kids and would create new obligations that online services must follow if children use their products.

It was initiated on the back of a bill passed in California – the California Age-Appropriate Design Code Act – which requires tech companies to install guardrails for child users or face fines and injunctions from the Attorney General. The California act passed in the same week that the Australian government issued demands to Omegle, Meta, Snap, and other tech companies, requiring them to report the measures they are taking to tackle child sexual exploitation material on their platforms and services.

The California law, in turn, had been inspired by the UK’s Age-Appropriate Design Code – the first statutory code of practice for children’s data anywhere in the world – which requires online services to consider the best interests of children when designing and developing their products and to offer privacy protection by default.

Though comprehensive and impressive, the code came too late for Molly Russell, a child whose suicide sparked an inquest that closed in the UK last week with a ruling that Instagram and other social media platforms had contributed in a “more than minimal way” to her death.

   

The tragic death of a teen, and a grieving family dedicated to tech accountability   

The widely publicized investigation into Molly’s death, driven by her devastated father, shone a light on the hideous content children are fed by companies like Meta and Pinterest during what the coroner described as algorithm-driven “binge periods”. An expert witness said the self-harm content he had viewed during the investigation was so “disturbing” and “distressing” that it caused him to lose sleep for weeks. The coroner found that the social media companies pushed content on the child that “sought to isolate and discourage discussion with those who may have been able to help.”

The inquest also spotlighted the cold indifference of big tech execs. Meta – whose representative Elizabeth Lagone arrived with decoy vehicles and bodyguards that, according to the BBC, “manhandled” press away – defended its practices overall as an attempt to preserve free expression. Lagone claimed Meta’s understanding was that there wasn’t clear research into the effect content has on children, and that it is “safe for people to be able to express themselves.”

Essentially: sucks for you, but free speech.  

Jud Hoffman, an executive at Pinterest, admitted that Pinterest was not safe at the time Molly accessed it, and acknowledged that recommendation emails sent by Pinterest to Molly (including “10 depression pins you might like”) contained “the type of content that we wouldn’t like anyone spending a lot of time with.” He admitted he wouldn’t want his own children viewing this content on Pinterest. (Just ours, presumably.)  

“It’s a ghetto of the online world that once you fall into it, the algorithm means you can’t escape it and keeps recommending more content,” Molly’s father Ian Russell said during his testimony. He later described it as a “demented trail of life-sucking content.”   

The inquest’s findings sent shockwaves across the water to Silicon Valley.   

“This article is full justification for why we need to be able to sue Meta, et al. Here is an exec under oath defending Instagram’s publication of suicide content. And an exec from Pinterest admitting their platform was unsafe. In these lawsuits, among other things, we see which defendants take responsibility and which are making a policy decision to addict, injure and kill more kids.” – @cagoldberglaw on Twitter

  

Back in the USA  

Last week, Senators Edward Markey and Richard Blumenthal and Representatives Kathy Castor and Lori Trahan sent a letter asking the FTC to update its regulations under the Children’s Online Privacy Protection Act (COPPA) to protect kids on tech platforms.

They urged the US to look to the UK’s Age-Appropriate Design Code to update COPPA with “additional protections that address pressing threats online”, including expanding the definition of ‘personal information’; requiring that platforms protect the confidentiality, security, and integrity of children’s data; updating COPPA to reflect the platforms kids actually use; defining which platforms are directed toward children; and implementing regulatory protections that reflect changes in online advertising practices.

“In countries around the world, government entities have begun to take action by enacting policies that combat pernicious online threats to kids. The United States must now do the same,” urged the lawmakers.   

“Studies show that kids lack the cognitive ability to defend themselves against common practices in the digital ecosystem. Kids’ screen time has doubled during the pandemic; 35 percent of parents report that their children began using social media at a younger age than they had originally planned; and close to half of teens say they are online ‘almost constantly.’ Experts agree that we have reached a crisis point for children and teens online.” – Lawmakers’ letter to the FTC

Multidistrict litigation 

This year, mass tort lawyers from all around the country have joined us in the fight against tech giants, adapting the product liability theories we used in Herrick v. Grindr and A.M. v. Omegle to argue that products like Instagram, Facebook, TikTok, Snap, and YouTube are defectively designed to addict young users. The cases – 88 have been filed so far – allege that these social media products are specifically designed to maximize screen time and usage, and that the result is young people saddled with severe eating disorders, crippling anxiety and depression, and, in some cases, completed suicides.

Haugen’s papers are the basis for much of our new knowledge about just how much harm these companies know their products are causing to users. In July, our friends at Beasley Allen filed for these cases to be joined in a multidistrict litigation. On Thursday, our founder Carrie Goldberg was in St. Louis to argue before the Judicial Panel on Multidistrict Litigation (JPML), where lawyers for Meta, ByteDance, Google, and Snap mostly conceded that an MDL was inevitable.

The creation of an MDL to litigate against big tech will be a historic new chapter. 

 

What now? 

On October 3, 2022, the Supreme Court took up a challenge to CDA Section 230: Gonzalez v. Google. The consequences are potentially huge.

This case is driven by yet another grieving father avenging his young daughter’s death.  

Nohemi Gonzalez was a 23-year-old killed in an ISIS attack in 2015. Her father sued Google for promoting pro-ISIS videos and content. The complaint alleges that Google “provided services that were central to ISIS’s growth and expansion, and that the assistance was provided over many years.” The Supreme Court will now consider whether Google is protected by Section 230 immunity when it recommends terrorism content and enables terrorist groups to spread their message, recruit members, and raise money.

Attorneys for Nohemi’s dad, Reynaldo Gonzalez, wrote in the petition to the Supreme Court: “Interactive computer services constantly direct such recommendations, in one form or another, at virtually every adult and child in the United States who uses social media. Application of §230 to such recommendations removes all civil liability incentives for interactive computer services to eschew recommending … harmful materials, and denies redress to victims who could have shown that those recommendations had caused their injuries, or the deaths of their loved ones.”

It’s thought that the Supreme Court’s decision to take up Gonzalez v. Google was driven by Justice Clarence Thomas, who has previously suggested the court should reconsider §230. In 2020, Justice Thomas acknowledged our case against Grindr in a major statement, citing our theory that tech companies should not be immune from product liability and that CDA §230 immunity must be limited to content – not conduct. Thomas explained that courts have “departed from the most natural reading of the text by giving Internet companies immunity for their own content.”

Gonzalez v. Google is set to be the Supreme Court’s first test of the immunity social media companies enjoy. It’s a pivotal moment. It’s overdue. And it’s a promising sign that the tide is turning, and fast.

For some, it comes too late: our clients A.M., Ruth Scott, the Devins family, the many families we represent whose children died after purchasing drugs on Snapchat, and more whose lives have been overturned or ended by mainstream tech products facilitating child sexual exploitation, drug overdoses, and suicide.

Companies like Google, Snap, Meta, and Amazon do not get to shirk responsibility by blaming an algorithm. They create the algorithm. They hoard vast wealth from the algorithm. They should not get to walk away when the products they build kill and maim people.

 

 

Connect with us on LinkedIn, Instagram, Facebook, YouTube, and Twitter to stay up to date with important news and free resources.