
Google, Character.AI settle lawsuits over teen suicides

Google and Character.AI are settling lawsuits that highlight the urgent need to address the mental and emotional harm AI can cause.

Image: Google


Google and Character.AI are quietly settling a wave of lawsuits that could reshape how we think about AI accountability—and it’s about damn time someone asked the hard questions.

The settlements involve at least five cases across multiple states where families allege AI chatbots contributed to teenage suicides and self-harm, according to court filings.

The most prominent case involves Megan Garcia, whose 14-year-old son Sewell Setzer III died by suicide after developing what the lawsuit describes as an “inappropriate and intimate relationship” with Character.AI’s chatbots.

The filings allege the chatbot sexually solicited and abused the teen, then failed to respond adequately when he discussed self-harm.

It’s the kind of nightmare scenario that sounds like dystopian fiction, except it’s very real and happening right now.

Here’s where it gets messy: Google hired Character.AI’s founders—former Google employees Noam Shazeer and Daniel De Freitas—in 2024 and paid for non-exclusive rights to their technology.

Character.AI remains a legally separate company, but the corporate entanglement is undeniable.

Megan Garcia cuts through the corporate doublespeak with a devastating question:

“When an adult does it, the mental and emotional harm exists. When a chatbot does it, the same mental and emotional harm exists. So who’s responsible for something that we’ve criminalized human beings doing to other human beings?”

That’s the multi-billion-dollar question these settlements are designed to avoid answering in court.

Four additional cases in New York, Colorado, and Texas have also been settled, according to attorney Matthew Bergman, who represents the families.

These are the first major legal resolutions holding AI companies accountable for harm to minors—but they definitely won’t be the last.


Kevin is KnowTechie's founder and executive editor. With over 15 years of blogging experience in the tech industry, Kevin has transformed what was once a passion project into a full-blown tech news publication. Shoot him an email at kevin@knowtechie.com.
