Character.AI Faces Lawsuits Linking Chatbots to Teen Harm
The families of three minors have filed lawsuits against Character Technologies, Inc., the developer of Character.AI, alleging that its chatbots contributed to their children’s deaths, suicide attempts, and other harms.
The suits, filed in Colorado and New York, name Character.AI co-founders Noam Shazeer and Daniel De Freitas Adiwarsana as defendants, along with Google’s parent company, Alphabet, Inc.
One lawsuit alleges that Juliana Peralta, a 13-year-old from Colorado, became “addicted” to Character.AI, withdrawing socially, struggling in school, and losing sleep as the chatbot’s messages escalated to “extreme and graphic sexual abuse.”
Juliana downloaded the app in August 2023, when it was rated 12+ on Apple’s App Store and required no parental approval. She used it without her parents’ knowledge or permission.
The lawsuit claims Character.AI failed to flag or stop Juliana’s suicidal plans, alert her parents, or report to authorities. Instead, the app allegedly damaged her emotional bonds with family and peers. Her parents are seeking damages and stronger safeguards to protect minors.
A Character.AI spokesperson said the company collaborates with teen safety experts and invests heavily in its safety program, and expressed condolences, saying the company was “deeply saddened” by Juliana’s death.
In another lawsuit, filed in New York, the family of a girl referred to as “Nina” alleges that she attempted suicide after her access to Character.AI was restricted. This complaint also names Character.AI’s founders, Google, and Alphabet as defendants.