Former TikTok Content Moderators Sue the Video-Sharing App Over the Emotional Toll of Graphic Videos
Two former TikTok moderators are suing the popular video-sharing app and its parent company, ByteDance. The federal lawsuit, which seeks class-action status, accuses the Chinese-owned app of violating California labor laws by failing to provide proper mental and emotional support for its content moderators.
The plaintiffs named in the suit are Reece Young and Ashley Velez. Velez, a mother of two boys, joined the company in May 2021 as a content moderator, tasked with reviewing hundreds of videos that violated the app's posting guidelines. According to the lawsuit, she was told when hired that she would be "the front line of defense from protecting children from seeing violence." The complaint states that "Plaintiff Velez saw bestiality and necrophilia, violence against children, and other distressing imagery." In an interview, she said she saw "death and graphic, graphic pornography," and that she was exposed daily to videos of nude children and of people being shot in the face. In one instance, Velez recalled watching a child being beaten, which left her crying for two hours straight. Despite this, Velez claims the company did little to support her and other moderators who had to view such footage for hours at a time, day after day.
In the complaint, Velez and Young accuse the tech giant of lacking the protocols necessary to protect employees who moderate graphic content. The suit alleges the company violated California labor law: it knew the trauma that comes from viewing such graphic videos at high volumes, yet did not have proper measures in place to help its employees cope with the mental stress.
“Defendants are aware of the negative psychological effects that viewing graphic and objectionable content has on content moderators. Despite this knowledge, Defendants fail to implement acknowledged standards of care to protect content moderators from harm,” the lawsuit reads. “To the contrary, Defendants impose productivity standards and quotas on their content moderators that are irreconcilable with applicable standards of care.”
The suit goes on to accuse the company of exposing employees to abnormally dangerous activities by requiring moderators to view excessive amounts of graphic footage, often for up to 12 hours a day.
According to the lawsuit, Young and Velez were not direct employees of TikTok but contract workers employed by third-party agencies. Even so, TikTok controlled their day-to-day operations and duties, and it had a system in place that tied contractor pay directly to quotas, encouraging contractors to hit excessive quota targets.
Additionally, Young and Velez say that before starting their roles as moderators, they were required to sign non-disclosure agreements to keep them from discussing the footage they viewed on the job.
Velez and Young contend that their exposure to such a massive amount of violent and graphic content caused them "immense stress and psychological harm." Both sought counseling after work hours, at their own expense, to deal with the psychological toll of the footage.
This lawsuit is not the first of its kind. Over the past few years, content moderators at several social media platforms have accused their employers of failing to provide a safe work environment, arguing that adequate support measures are not in place for moderators who view graphic and violent footage in large volumes.
In May 2020, Facebook settled a similar suit brought by the same lawyers who now represent Velez and Young, filed on behalf of thousands of moderators who reviewed content for the platform. Facebook agreed to a $52 million settlement that gave former content moderators at least $1,000 each in relief.