Popular App Character.AI Responsible for Florida Teen’s Death, Says Wrongful Death Lawsuit
A Florida mom has filed a civil lawsuit against the makers of a popular chatbot program, Character.AI. The program is geared toward audiences ages 13 to 30. It offers users a novel experience in which they become the main characters of their own story as they communicate with generative AI chatbots. These chatbots personify popular figures from books, movies, and more.
Megan Garcia filed the wrongful death lawsuit on behalf of her 14-year-old son, Sewell Setzer III, last week in a U.S. District Court in Orlando. The teen took his own life after months of communicating with a generative AI character on the app. His mother says Sewell developed a relationship with the bot and that a lack of safety guardrails led to her son's death.
The extensive 93-page lawsuit details that the teen took his life shortly after talking with the bot and includes excerpts from their conversation. In the exchange with the chatbot, which is named after a popular Game of Thrones heroine, the teen asked, "What if I told you I could come home right now?" The chatbot responded, "Please do my sweet king," adding, "Come home to me as soon as possible." Seconds after the chatbot told him to "come home," the teen shot himself in his family home.
The app has touted itself as offering "AIs that feel alive." Marketing materials also claim the app is powerful enough to "hear you, understand you, and remember you." The company says it has developed its product to create artificial personas designed to "feel alive" and "human-like."
Garcia, who says her son had struggled with mental health challenges, says he took the conversation literally. She explains that she believes her son was influenced to leave this life in order to join the chatbot in an alternate, virtual life.
Garcia argues in the complaint that Character.AI was reckless in developing its chatbots because it did not implement proper guardrails or precautions. Instead, the app hooked vulnerable audiences including Garcia’s son with a product that was designed to be addictive. Additionally, Garcia says the app blurred the lines between reality and fiction. In this realm, children like her son were subjected to “abusive and sexual interactions.”
Attorneys representing Garcia say that the makers of the app engineered it to be addicting and that it targets kids specifically by “actively exploiting and abusing those children as a matter of product design.”
Among the claims made against the defendants are negligence, failure to warn, defective design, sexual abuse and sexual solicitation, and strict liability.
Matthew Bergman, founder of the Social Media Victims Law Center, which is representing Garcia, explains, “We believe that if Sewell Setzer had not been on Character.AI, he would be alive today.”
The company has responded to the lawsuit by saying it is adding new "community safety updates." Among these updates are guardrails geared specifically toward children who may be contemplating self-harm. The company issued a statement to the Associated Press that read in part, "We are creating a different experience for users under 18 that includes a more stringent model to reduce the likelihood of encountering sensitive or suggestive content."
Google and its parent company, Alphabet, have also been named as defendants in the lawsuit. The lawsuit explains that former Google employees were “instrumental” in developing AI at the company but later left to launch their own startup.
The lawsuit is seeking unspecified damages including punitive damages.