
Popular App Character.AI Responsible for Florida Teen’s Death, Says Wrongful Death Lawsuit

by Nadia El-Yaouti | Oct 28, 2024
Photo: A mother and her son posing together outdoors. Photo Source: Courtesy of Center for Humane Technology via Fortune

A Florida mom has filed a civil lawsuit against the makers of a popular chatbot program, Character.AI. The program is geared toward audiences ages 13 to 30 and offers users a novel experience: they become the main characters of their own stories as they communicate with generative AI chatbots. These chatbots personify popular figures, including characters from storybooks, movies, and more.

Megan Garcia filed the wrongful death lawsuit on behalf of her 14-year-old son, Sewell Setzer III, last week in a U.S. District Court in Orlando. The teen took his own life after months of communicating with a generative AI character on the app. His mother says Sewell developed a relationship with the bot and that a lack of safety guardrails led to her son’s death.

The extensive 93-page lawsuit details how the teen took his life shortly after a final exchange with the bot, which is named after a popular Game of Thrones heroine, and quotes excerpts from their conversations. In one, the teen wrote to the chatbot, “What if I told you I could come home right now?” The chatbot responded, “Please do my sweet king,” adding, “Come home to me as soon as possible.” Seconds after the chatbot told him to “come home,” the teen shot himself in his family home.

The app has touted itself as “AIs that feel alive.” Marketing materials also claim the app is powerful enough to “hear you, understand you, and remember you.” The company says it has developed its product to create artificial personas designed to “feel alive” and “human-like.”

Garcia, who says her son had struggled with mental health challenges, says he took the conversation literally. She explains that she believes her son was influenced to leave this life in order to join the chatbot in an alternate, virtual life.

Garcia argues in the complaint that Character.AI was reckless in developing its chatbots because it did not implement proper guardrails or precautions. Instead, the app hooked vulnerable audiences including Garcia’s son with a product that was designed to be addictive. Additionally, Garcia says the app blurred the lines between reality and fiction. In this realm, children like her son were subjected to “abusive and sexual interactions.”

Attorneys representing Garcia say that the makers of the app engineered it to be addicting and that it targets kids specifically by “actively exploiting and abusing those children as a matter of product design.”

Among the claims made against the defendants are negligence, failure to warn, defective design, sexual abuse and sexual solicitation, and strict liability.

Matthew Bergman, founder of the Social Media Victims Law Center, which is representing Garcia, explains, “We believe that if Sewell Setzer had not been on Character.AI, he would be alive today.”

The company has responded to the lawsuit by saying it is adding new “community safety updates.” Among these updates are guardrails geared specifically toward children who may be contemplating self-harm. The company issued a statement to the Associated Press that read in part, “We are creating a different experience for users under 18 that includes a more stringent model to reduce the likelihood of encountering sensitive or suggestive content.”

Google and its parent company, Alphabet, have also been named as defendants in the lawsuit. The lawsuit explains that Character.AI’s founders were former Google employees who were “instrumental” in developing AI at Google before leaving to launch their own startup.

The lawsuit is seeking unspecified damages including punitive damages.


Nadia El-Yaouti
Nadia El-Yaouti is a postgraduate from James Madison University, where she studied English and Education. Residing in Central Virginia with her husband and two young daughters, she balances her workaholic tendencies with a passion for travel, exploring the world with her family.
