As reported by K-12 Dive, Character.AI and Google have agreed to mediate a settlement with the mother of a 14-year-old who died by suicide after interacting with Character.AI’s artificial intelligence companion. In an October 2024 wrongful death lawsuit against the tech companies, the child’s mother alleged that Character.AI was negligent in its “unreasonably dangerous designs” and that it was “deliberately targeting underage kids.” She also averred that Character.AI knew its AI companions would be harmful to minors but failed to redesign its app or warn users about the product’s dangers.

In a similar case, Google and Character.AI agreed to settle a suit brought by a Colorado family over the wrongful death of their 13-year-old daughter. The pending settlements come as a similar lawsuit, filed by the Social Media Victims Law Center and Tech Justice Law Project on behalf of another family, alleges that ChatGPT’s generative AI tools led to their child’s suicide.

For more from K-12 Dive, click here.