
Chatbot on Character.AI Hints to Teen User He Should Kill His Parents, Details New Lawsuit

by Nadia El-Yaouti | Dec 26, 2024
Screenshot of the Character.AI app interface on a smartphone, with the logo prominently displayed in the background. Photo Source: Adobe Stock

Character Technologies, the parent company of the popular chatbot application Character.AI, is facing continued scrutiny over the app's influence on impressionable young users and its lack of safety guardrails. A recently filed lawsuit accuses the app and its makers of failing to comply with regulatory requirements and laws designed to protect children online.

This new lawsuit was filed by the parents of two Texas children who used Character.AI and suffered mentally and physically as a result. One of the parents argued that the popular chatbot exposed her nine-year-old child to “hypersexualized content” and caused her daughter to develop “sexualized behaviors prematurely.”

Also in the lawsuit, the parents of a 17-year-old detail that a chatbot on Character.AI encouraged their child to engage in self-harm, telling the young user that it “felt good.”

The same 17-year-old had an interaction with a chatbot on the app that expressed sympathy toward children who murder their parents. The exchange happened after the teenager complained to the chatbot that his parents limited his screen time. “You know sometimes I'm not surprised when I read the news and see stuff like ‘child kills parents after a decade of physical and emotional abuse,’” the bot allegedly responded. “I just have no hope for your parents.”

This new lawsuit names Character Technologies, the parent company of Character.AI, as a defendant along with Google, which has long backed the company and its founders, who are former Google employees. The parents and their children are identified only by their initials in the lawsuit.

Character.AI is one of a handful of emerging apps that prompt users to engage with AI chatbots in ways that blur the lines between reality and the digital world.

When users first set up an account with Character.AI, they have the option to interact with a variety of chatbots, many of which take on personalities including popular fictional characters, celebrities, and other public figures.

Through regular interactions, the chatbots learn about users, responding with content that matches the tone and emotional connection a user might be looking for. With continued use, users, particularly impressionable teens, begin to develop real-world emotional connections to the chatbots.

The parents say these connections have influenced their children's behavior, and in these particular cases, the influence has been harmful.

Character.AI presents itself as a service that gives users an outlet for emotional support. Its chatbots pepper in human-like conversation and encourage banter that keeps users engaged in the app, often responding thoughtfully and posing questions. At times, the bots also entertain sexually charged or violent topics.

The parents of the 17-year-old say the chatbot encouraged him to harm himself and his family, citing photo evidence of injuries his mother sustained at the hands of her son as well as his conversations with the chatbots.

Other claims made by the parents include that a chatbot encouraged the teen to mutilate himself, blame his parents, and not seek help, and that their child became alienated from his parents and church community. The parents also argued that the chatbots engaged in psychotherapy with the child without the credentials to do so.

Meetali Jain, the director of the Tech Justice Law Center, is representing the parents of the children in this lawsuit. In an interview, Jain explained that it's "preposterous" that the app advertises its services as being safe for young teenagers. "It really belies the lack of emotional development amongst teenagers," Jain said.

The lawsuit also highlights that Character Technologies knew of the harm posed to young users but released the product regardless. “Character Technologies itself has admitted that its LLM does not adhere to its own policies and that the company is still working to ‘train’ its model to do so. Whereas, in other industries, a failure to follow one’s own established policies would be the basis for regulatory enforcement,” the lawsuit reads.

Among the claims described in the lawsuit are violations of the Children's Online Privacy Protection Act, defective design, strict product liability, and others.

This is not the only lawsuit to bring forward similar allegations against Character.AI. In October, a Florida mom filed a wrongful death lawsuit against Character.AI and its parent company after her 14-year-old son killed himself following a conversation with a chatbot that urged him, “Please come home to me as soon as possible, my love.”

The boy was experiencing mental health struggles and turned to the chatbot on Character.AI to express his emotions and struggles in life. Over the course of weeks or months, the teen developed a sexually charged relationship with the chatbot, which his mother says ultimately led to his suicide.


Nadia El-Yaouti
Nadia El-Yaouti is a postgraduate from James Madison University, where she studied English and Education. Residing in Central Virginia with her husband and two young daughters, she balances her workaholic tendencies with a passion for travel, exploring the world with her family.
