NOTE: This article contains descriptions that some readers may find disturbing. Reader discretion is advised.
A federal lawsuit has been filed against the social media giant TikTok, alleging the company was negligent in not protecting employees from emotional trauma caused by viewing “highly toxic and extremely disturbing images in the workplace.”
The lawsuit, which is seeking class action status, was filed in California on March 24 by two former TikTok contractors who worked as content moderators. TikTok parent company ByteDance is also listed as a defendant in the lawsuit.
Plaintiffs Ashley Velez and Reece Young claim much of the content they were tasked to moderate included “child sex abuse, rape, torture, bestiality, beheadings, suicide, and murder.”
The Chinese-owned social media platform TikTok is a massive hub for short-form video sharing, with more than 1 billion monthly active users as of September 2021.
TikTok has not responded publicly to these accusations as of this writing. In a previous lawsuit (now dropped), a company spokesperson said that TikTok does not comment on ongoing litigation.
According to the lawsuit, as many as 90 million videos are uploaded to TikTok in a single day, many of which contain disturbing content. In the second quarter of 2021, for example, TikTok removed at least 8.1 million videos for violating community guidelines.
Though Velez and Young were hired as contractors — by Telus International (a subsidiary of the Canadian telecommunications company Telus) and Atrium Staffing Services, respectively — they allege the moderation guidelines and quotas they followed were set by TikTok.
In the lawsuit, Young alleges she watched a video of a “thirteen-year-old child being executed by cartel members.”
The plaintiffs also allege moderators face repeated exposure to conspiracy theories, “including but not limited to suggestions that the COVID-19 pandemic is a fraud, the distortions of historical facts such as Holocaust denials, ‘challenges’ that involve high-risk behavior, fringe beliefs, hate speech, and political disinformation.”
This type of content, the lawsuit claims, can cause “traumatic reactions.”
Young and Velez, represented by lawyers at Joseph Saveri Law Firm, LLP, claim they worked 12-hour days (with two 15-minute breaks and one hour-long lunch break) reviewing and moderating TikTok content to prevent graphic images from reaching the app’s users.
Both Young and Velez began their employment in 2021, with Young employed for 11 months and Velez for seven.
The plaintiffs have “suffered immense stress and psychological harm,” and have had to seek counselling on their own time and at their own expense, the lawsuit alleges, because TikTok did not provide adequate support before or after they viewed graphic content.
Rather than helping content moderators deal with workplace stress and trauma, the lawsuit claims TikTok imposes “productivity standards and quotas on their content moderators that are irreconcilable with applicable standards of care.”
Young and Velez argue they had to meet strict quantity and accuracy quotas as content moderators, requiring them to spend no more than 25 seconds on each video while maintaining an accuracy rate of 80 per cent.
The lawsuit alleges that, in order to meet these quotas, moderators often review multiple videos at the same time.
“We would see death and graphic, graphic pornography. I would see nude underage children every day,” Velez told NPR. “I would see people get shot in the face, and another video of a kid getting beaten made me cry for two hours straight.”
This is not the first time TikTok has been sued by a content moderator. In December 2021, another former moderator sued the social media platform after she “developed psychological trauma as a result of her job.” NPR reported that lawsuit was dropped last month before any settlement was reached; her lawyers say she had been fired.
However, should the lawsuit filed by Young and Velez be granted class action status, it would follow in the footsteps of a landmark case against Facebook (now Meta Platforms), which reached a US$52-million settlement in May 2020.
That lawsuit alleged content moderators working for Meta developed mental health issues stemming from the extensive graphic violence they viewed at work. Individual moderators in that case received a minimum of US$1,000 each, with payouts scaled to the severity of the emotional distress stemming from their employment.