TikTok Not Responsible for Death of a 10-Year-Old Who Joined a “Blackout Challenge”
Nylah was only ten years old when she attempted the popular “Blackout Challenge” posted on TikTok. The challenge, available to the app’s more than one billion users worldwide, asks viewers to use a household item to strangle themselves until they black out and to encourage others to do the same. Attempting the challenge alone in her closet using a purse strap, Nylah lost consciousness. Her mother found her unresponsive, and despite several attempts to revive her, Nylah died. Nylah’s mother sued TikTok for wrongful death, but a federal judge dismissed the suit.
Judge Paul S. Diamond of the United States District Court for the Eastern District of Pennsylvania granted defendants TikTok, Inc. et al.’s motion to dismiss on October 25. His decision rejected plaintiff Taiwainna Anderson’s theory that TikTok could be held liable for the third-party content it delivered to her daughter, holding that because her claims treated TikTok as the “publisher” of that content, TikTok was immune from suit under the Communications Decency Act (CDA).
The CDA, enacted in 1996, prohibits individuals from “knowingly transmitting obscene or indecent messages to recipients under the age of 18.” According to the Electronic Frontier Foundation, Section 230, a key provision of the Act, is now “one of the most valuable tools for protection of freedom of expression and innovation on the Internet.” The section says, “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”
Diamond’s decision began with an explanation of TikTok. He wrote that it is a social media platform that lets users create and share short videos. He cited data showing that 28% of TikTok’s users are under 18, and he noted that one of TikTok’s most popular features is its “For You Page,” which uses an algorithm that draws on factors such as a user’s age, location, and interests to identify videos likely to appeal to that specific user. When Nylah opened her TikTok app in December 2021, she saw the “Blackout Challenge,” which tragically caused her death. Several other children have also died attempting the Challenge.
Anderson’s lawsuit cites several causes of action, including design defect, failure to warn under strict liability, and negligence, as well as claims under Pennsylvania’s Unfair Trade Practices and Consumer Protection Law and California’s Consumers Legal Remedies Act. TikTok moved to dismiss all of them, citing the CDA and arguing that Anderson failed to state a valid claim for relief. Judge Diamond’s decision covered only the products liability and negligence claims; because those were “fully briefed,” he concluded that a ruling on them was appropriate at this stage. He wrote that he would “dismiss the other claims later.”
Next, Diamond turned to the legal standards governing his decision. He explained that dismissal would be permissible “only if Section 230 immunity is evident from the face of the complaint.” Deciding that it was, he said that Congress had “immunized providers’ decisions relating to the monitoring, screening, and deletion of content from (their) networks.” Those are actions traditionally associated with the role of a publisher. He cited precedent that the Section was enacted “to maintain the robust nature of Internet communication and, accordingly, to keep government interference in the medium to a minimum.” He also noted that the “staggering amount” of information on the Internet made it “impossible” to prescreen each message.
Diamond wrote that Anderson’s claims, regardless of the “creative labeling” of her causes of action, all still require TikTok to be treated as a “publisher.” He said the only question he had to answer was “whether the duty that (Anderson) alleges (TikTok) violated derives from (TikTok’s) status or conduct as publisher.” All of Anderson’s allegations regarding how the algorithm delivered “dangerous and deadly” videos are based on the “defective manner” in which TikTok “published a third party’s dangerous content.”
“TikTok’s actions were that of a publisher,” he restated, concluding that “Anderson’s claims are plainly barred by Section 230 immunity.”
Diamond concluded, “Nylah Anderson’s death was caused by her attempt to take up the ‘Blackout Challenge.’ Defendants did not create the Challenge; rather, they made it readily available on their site. Defendants’ algorithm was a way to bring the Challenge to the attention of those likely to be most interested in it. In thus promoting the work of others, Defendants published that work – exactly the activity Section 230 shields from liability.”
His opinion went on to question the “wisdom” of conferring that immunity. He is not alone. According to NBC News, TikTok has announced that future searches for the “Blackout Challenge” will redirect viewers to a site that discusses the “dangerous nature” of some online challenges. Could that possibly be an adequate remedy?
This is a time when the behavior of TikTok, along with Facebook, Instagram, Snapchat, and other social media sites, is questioned daily by media commentators and citizens. Social media plays a huge role in discussions of major issues that threaten not only children’s lives but democracy itself. Anderson v. TikTok, Inc. should raise consciousness and stimulate discussion about narrowing the immunity conferred by the CDA’s Section 230. But how likely are rich and powerful social media companies to sit quietly as others try to change the status quo?