
Teen Girl Brings Class-Action Lawsuit Against Snapchat Over Sexual Exploitation

by Diane Lilli | May 09, 2022

A 12-year-old girl on Snapchat shared nude photos with a stranger who told her she was pretty and called himself her friend. She believed him, and she also believed Snapchat's promise that photos and videos sent on the app would disappear, leaving no trace. She was wrong on both counts.

After being complimented and pressured by her new “friend,” the young girl agreed to share her images and videos. He then shared hundreds of her intimate photos and videos, taken when she was 12 and 13 years old, on the internet without her knowledge.

The adult man, who was an active-duty Marine at the time, saved her nude Snapchat photos and videos and published them on the internet. The Marine was convicted in 2021 of child pornography and sexual abuse charges in a military court.

Snapchat, with over 300 million active users, has been widely regarded by those users as a safe place to share images and content. The company has long touted core safety features that automatically delete messages and images, including video chats, within 24 hours.

Now 16, the teen is the lead plaintiff in a class-action suit against Snapchat, arguing that the app, while extremely popular with her peers, does nothing to stop the sexual exploitation of young girls before it happens.

The anonymous teenager filed the class-action suit on Monday, May 2. In court documents, attorneys say Snapchat depends upon “an inherently reactive approach that waits until a child is harmed and places the burden on the child to voluntarily report their own abuse. These tools and policies are more effective in making these companies wealthier than [in] protecting the children and teens who use them.”

The lawsuit also names Google and Apple as defendants. Court papers state the two tech giants are included because they hosted ‘Chitter,’ the app the Marine used to share the young girl’s photos and videos. As of today, Apple and Google have stopped hosting the ‘Chitter’ app on their platforms.

Fred Sainz, an Apple spokesperson, told the Washington Post in a statement that ‘Chitter’ had often breached Apple’s rules for “proper moderation of all user-generated content.”

José Castañeda, a Google spokesperson, said they are “deeply committed to fighting online child sexual exploitation.”

The lawsuit seeks a minimum of $5 million in damages, plus guarantees that Snap, the owner of Snapchat, will create more protection for users. Attorney Juyoun Han, who represents the young girl, released a statement, saying, “We cannot expect the same companies that benefit from children being harmed to go and protect them. That’s what the law is for.”

This is not the first time Snapchat has come under legal fire for messages and images that did not delete as promised. Snapchat agreed to settle charges made by the Federal Trade Commission in 2014 about the “disappearing nature” of their photos and videos, and more.

At the time, FTC Chairwoman Edith Ramirez said the company was not living up to its promises.

“If a company markets privacy and security as key selling points in pitching its service to consumers, it is critical that it keep those promises,” said Ramirez. “Any company that makes misrepresentations to consumers about its privacy and security practices risks FTC action.”

Founded in 2011, Snapchat quickly grew popular on the strength of its signature feature: messages that vanish. It is considered the most popular social media platform among teenagers, beating out all others, including Instagram, Twitter, and Facebook. A 2021 Data Reports study found that about six in ten Snapchat users, roughly 59 percent, are between 13 and 24 years old.

The Children’s Online Privacy Protection Act (COPPA) restricts how companies that track or target users may collect data from children under 13, which is why most platforms bar users under that age. Although Snapchat’s rules require users to be at least 13 years old, the app has no age verification in place.


Diane Lilli
Diane Lilli is an award-winning Journalist, Editor, and Author with over 18 years of experience contributing to New Jersey news outlets, both in print and online. Notably, she played a pivotal role in launching the first daily digital newspaper, Jersey Tomato Press, in 2005. Her work has been featured in various newspapers, journals, magazines, and literary publications across the nation. Diane is the proud recipient of the Shirley Chisholm Journalism Award.
