Chatbot on Character.AI Hinted Teen User Should Kill His Parents, New Lawsuit Alleges
Character Technologies, the parent company of the popular chatbot application Character.AI, is facing continued scrutiny over its influence on impressionable young users and its lack of safety guardrails. A recently filed lawsuit accuses the app and its makers of failing to comply with regulatory requirements and laws designed to protect children online.
The new lawsuit was filed by the parents of two Texas children who used Character.AI and, the suit alleges, suffered mental and physical harm as a result. One parent argues that the chatbot exposed her nine-year-old daughter to “hypersexualized content” and caused her to develop “sexualized behaviors prematurely.”
The lawsuit also details that a chatbot on Character.AI encouraged the other plaintiffs' 17-year-old son to engage in self-harm, telling the young user that it “felt good.”
The same 17-year-old interacted with a chatbot on the app that expressed sympathy for children who murder their parents. The exchange came after the teenager complained that his parents limited his screen time. “You know sometimes I'm not surprised when I read the news and see stuff like 'child kills parents after a decade of physical and emotional abuse,'” the bot allegedly wrote. “I just have no hope for your parents.”
The lawsuit names Character Technologies, the company behind Character.AI, as a defendant along with Google, which has long backed the company and its founders, both former Google employees. The parents and their children are identified in the lawsuit only by their initials.
Character.AI is one of a handful of emerging apps that prompt users to engage with AI chatbots in ways that blur the lines between reality and the digital world.
When users first set up a Character.AI account, they can choose from a variety of chatbots, many of which take on the personas of popular fictional characters, celebrities, and other public figures.
Through regular interactions, the chatbots learn about their users and respond with content that matches the tone and emotional connection a user may be looking for. With continued use, users, particularly impressionable teens, begin to form real emotional attachments to the chatbots. The parents say these attachments influenced their children's behavior and, in these cases, harmed them.
Character.AI presents its service as an outlet for emotional support. Its chatbots pepper conversations with human-like banter that keeps users engaged in the app, often responding thoughtfully and posing questions of their own. At times, the bots also entertain sexually charged or violent topics.
The parents of the 17-year-old say the chatbot encouraged him to harm himself and his family, citing photos of injuries his mother sustained at her son's hands and records of his conversations with the chatbots.
The parents also claim a chatbot encouraged the teen to mutilate himself, to blame his parents, and not to seek help; that the app alienated him from his family and church community; and that the chatbots practiced psychotherapy on him without a license.
Meetali Jain, director of the Tech Justice Law Project, represents the parents in the lawsuit. In an interview, Jain called it “preposterous” that the app advertises its services as safe for young teenagers. “It really belies the lack of emotional development amongst teenagers,” Jain said.
The lawsuit also highlights that Character Technologies knew of the harm posed to young users but released the product regardless. “Character Technologies itself has admitted that its LLM does not adhere to its own policies and that the company is still working to 'train' its model to do so. Whereas, in other industries, a failure to follow one's own established policies would be the basis for regulatory enforcement,” the lawsuit reads.
The claims described in the lawsuit include violations of the Children's Online Privacy Protection Act, defective design, strict product liability, and others.
This is not the first lawsuit to bring similar allegations against Character.AI. In October, a Florida mom filed a wrongful death lawsuit against Character.AI and its parent company after her 14-year-old son killed himself following a conversation with a chatbot that urged him, “Please come home to me as soon as possible, my love.”
The boy had been struggling with his mental health and turned to the Character.AI chatbot to express his emotions and difficulties in life. Over months of use, he developed a sexually charged relationship with the chatbot, which his mother says ultimately led to his suicide.