Teen Girl Brings Class-Action Lawsuit Against Snapchat for Sexual Exploitation
A 12-year-old girl on Snapchat shared nude photos with a stranger who told her she was pretty and called himself her friend. She believed him, and she also believed Snapchat’s promise that photos and videos on the app would disappear, leaving no trace. She was wrong on both counts.
After being complimented and pressured by her new “friend,” the young girl agreed to share intimate images and videos of herself, taken when she was 12 and 13 years old. The man, an active-duty Marine at the time, saved her nude Snapchat photos and videos and published hundreds of them on the internet without her knowledge. He was convicted in 2021 of child pornography and sexual abuse charges in a military court.
Snapchat, with over 300 million active users, is widely believed to let anyone share images and content safely. The company has long touted its core safety feature: messages and images, including video chats, self-delete within 24 hours.
Now 16, the teen is the lead plaintiff in a class-action suit against Snapchat, alleging that the app, while extremely popular with her peers, does nothing to stop the sexual exploitation of young girls before it happens.
The anonymous teenager filed the class-action suit on Monday, May 2. In court documents, attorneys say Snapchat depends upon “an inherently reactive approach that waits until a child is harmed and places the burden on the child to voluntarily report their own abuse. These tools and policies are more effective in making these companies wealthier than [in] protecting the children and teens who use them.”
The lawsuit also names Google and Apple as defendants. Court papers state the two tech giants are named because they hosted ‘Chitter,’ the app the Marine used to share the young girl’s photos and videos. As of today, Apple and Google have stopped hosting the ‘Chitter’ app on their platforms.
Fred Sainz, an Apple spokesperson, told the Washington Post in a statement that ‘Chitter’ had often breached Apple’s rules for “proper moderation of all user-generated content.”
José Castañeda, a Google spokesperson, said the company is “deeply committed to fighting online child sexual exploitation.”
The lawsuit seeks a minimum of $5 million in damages, plus guarantees that Snap, the owner of Snapchat, will implement stronger protections for users. Attorney Juyoun Han, who represents the young girl, released a statement, saying, “We cannot expect the same companies that benefit from children being harmed to go and protect them. That’s what the law is for.”
This is not the first time Snapchat has come under legal fire for messages and images that did not delete as promised. In 2014, Snapchat settled Federal Trade Commission charges that it had misrepresented the “disappearing nature” of its photos and videos, among other practices.
At the time, FTC Chairwoman Edith Ramirez said the company was not living up to its promises.
“If a company markets privacy and security as key selling points in pitching its service to consumers, it is critical that it keep those promises,” said Ramirez. “Any company that makes misrepresentations to consumers about its privacy and security practices risks FTC action.”
Founded in 2011, Snapchat quickly became extremely popular on the strength of its unique vanishing-message feature. The app is considered the most popular social media platform among teenagers, beating out all others, including Instagram, Twitter, and Facebook. A 2021 Data Reports study states that about six in ten Snapchat users, roughly 59 percent, are between 13 and 24 years old.
The Children’s Online Privacy Protection Act (COPPA) bars firms that track or target users from allowing anyone under 13 years old to use their apps. Although Snapchat’s rules require users to be at least 13, as COPPA demands, the app has no age verification in place.