Radio Host Files First Defamation Lawsuit Against OpenAI, Creating a Test Case for Future AI Suits
A Georgia-based radio host, Mark Walters, has sued OpenAI in the first defamation lawsuit of its kind. OpenAI owns and operates the now-viral platform ChatGPT, an AI program that answers questions and writes articles, but sometimes produces content that is fabricated rather than factual.
Walters filed the lawsuit on June 5 in the Superior Court of Gwinnett County, Georgia. He is seeking unspecified monetary damages from OpenAI.
In the lawsuit, Walters says ChatGPT presented false, defamatory information as fact, including a fabricated claim that he had been accused of defrauding and embezzling funds from a non-profit organization.
ChatGPT responds to queries from its users, who often include journalists. In this case, reporter Fred Riehl asked the AI for information and received a fictional reply.
Numerous ChatGPT users have reported erroneous content generated by the program. The core problem is that such AI systems often cannot distinguish fact from fiction. For example, if a user phrases a question in a leading manner, the system may simply invent details and present the fabricated reply as accurate information.
With so many reports of AI systems giving fictional replies to users, many more lawsuits are expected to follow. When ChatGPT offered false information about Walters, it harmed his reputation: the fabricated claims accused him of serious crimes, namely defrauding and embezzling from a non-profit organization.
But does current US law even cover a lawsuit like Walters', which targets an AI system? These legal matters are in uncharted waters. Section 230 of the Communications Decency Act famously shields internet companies from liability for content created by third parties and shared or hosted on their sites. Will Section 230 be invoked in chatbot lawsuits? While the provision protects platforms from being sued as "publishers" of false information, it does not necessarily protect the party that originated the false statement, which in this case would be ChatGPT and its owner, OpenAI.
ChatGPT's user agreement includes a disclaimer warning that the program may "occasionally generate incorrect information." At the same time, the company promotes ChatGPT as trustworthy, describing it as a tool to "get answers" and "learn something new."
Liability lawsuits against AI systems such as ChatGPT may be viable because defamation law prohibits creating and spreading false, derogatory statements about a person, though whether that doctrine extends to machine-generated content remains untested.
According to court documents filed by Walters, journalist Riehl asked ChatGPT to summarize an actual federal court case and provided a link to the PDF of the complaint so the AI could verify it. ChatGPT instead replied with a made-up summary that mixed fiction with fact, including a document containing false allegations against the radio host.
Among the fictional allegations was the claim that Walters embezzled funds "in excess of $5,000,000" from the Second Amendment Foundation, a non-profit. This is not factual; it was invented by ChatGPT and delivered to the reporter as fact. Riehl did not publish the fabricated content, as he checked the information against a second source.
Notably, ChatGPT is not supposed to be able to read PDFs or other external links at all, yet it responded as though it had reviewed the document, generating the false allegations in the summary it provided.
For now, all eyes, including ChatGPT's, will be on Walters' lawsuit.