Over the past decade, we’ve seen a rise in peer-to-peer “emotional support” platforms that randomly match unvetted anonymous strangers to each other. With the high cost of therapy and shortage of mental health professionals, chatting online with a peer at no cost seems like a simple solution. However, downloading a mental health app marketed as offering emotional support does not mean users will speak with vetted, trained professionals.
Mashable recently released a multi-part investigation, led by reporter Rebecca Ruiz, into the growing marketplace for emotional support apps and platforms. While focused primarily on the platform 7 Cups, Ruiz's coverage also includes competitors Wisdo Health, Circles, HearMe and more.
Ruiz found that users, especially teens, are put at risk while trying to find connection and support online. We took down Omegle, so we know that online platforms attract predators and give them a venue to meet, groom, coerce and exploit children and teens.
But to be groomed on a mental health app? Every day, our work shows us the ways big tech contributes to the declining mental health of children and teens. Alongside politicians and advocacy groups, we've challenged these companies to come up with ways to improve their products. Too often, however, the "solutions" they offer simply monetize the decline in mental health and create new opportunities to match predators with vulnerable people: children, teens, and those who are already struggling mentally.
The investigation into 7 Cups found abuse on the platform that included discussion of sexual acts and fetishes, as well as encouragement of suicide. In one case, a 14-year-old girl sent explicit photos of herself to a 42-year-old man posing as a 15-year-old boy; he also encouraged her to leave her family and join him in another state.
Our founder Carrie Goldberg spoke to Mashable about these platforms and noted, “it’s a product that’s grooming people to be revealing very intimate details of their life.”
Tech companies foster declining mental health in today's youth, and then facilitate violence against those vulnerable people. And it's in their playbook to use Section 230 of the Communications Decency Act as their defense so that they're not liable for any harms caused by their platforms. While we wait for the long-overdue overhaul of Section 230, big tech will continue using it as a shield when people are bullied, threatened and abused on their platforms. And just as we took Omegle out of commission using a product liability theory, we'll keep pursuing the footholds where Section 230 doesn't apply.
As Carrie also told Mashable, “[t]here’s not a culture where investors or founders are necessarily looking at the ways that a product can be abused, because they’re going in arrogantly thinking that they’re going to be immune from all harms that happen.”
Our team of experts will do everything they can to stop online harassment and protect your safety and peace of mind. If you are looking for support because of abuse, violence, or other extreme harms caused by a mental health or emotional support app, please contact us at https://www.cagoldberglaw.com/contact/ or call 646-666-8908.