Two former TikTok content moderators are suing the company over the emotional toll of viewing graphic content on the app, alleging that TikTok has “failed to provide a safe workplace for the thousands of contractors who are the gatekeepers between the unfiltered, disgusting and offensive content uploaded to the App and the hundreds of millions of people who use the App every day.”

Ashley Velez and Reece Young both worked as content moderators for TikTok after being recruited by outside staffing agencies and were classified by the company as independent contractors. Both women allege that during their time with the company they saw “graphic and objectionable content including child sexual abuse, rape, torture, bestiality, beheadings, suicide, and murder.” They also allege that TikTok forced them to watch such videos in high volumes to meet productivity standards and failed to “implement acknowledged standards of care to protect content moderators from harm.”

According to the lawsuit, filed in the Northern District of California, the plaintiffs put in grueling 12-hour days moderating content for TikTok, which is owned by the Chinese company ByteDance and has more than a billion monthly active users. They were among the 10,000 content moderators tasked with ensuring that uploads to the app fell within TikTok’s community guidelines. More than 81 million videos were removed from the app for violating those guidelines during the second quarter of 2021 alone.

During their time as TikTok content moderators, the lawsuit alleges, Velez, who worked for the company from May to November 2021, encountered content including “bestiality and necrophilia, violence against children, and other distressing imagery.” Young, who worked for the company for eleven months starting in early 2021, saw disturbing material including a thirteen-year-old child being executed by cartel members, as well as bestiality videos.

“We would see death and graphic, graphic pornography. I would see nude underage children every day,” Velez told NPR. “I would see people get shot in the face, and another video of a kid getting beaten made me cry for two hours straight.”

Both women allege that they felt pressured to hit their quota targets and that TikTok did not provide appropriate mental health counseling for content moderators exposed to such images, increasing their risk of developing mental health conditions such as PTSD. They also allege that TikTok made them sign non-disclosure agreements at the start of their employment, preventing them from discussing what they saw on the job with their loved ones.

Like most major tech companies, TikTok relies on outside contractors to moderate content that violates its community guidelines, and it is not the only company to be accused of failing to provide appropriate safeguards for those workers. In 2020, for instance, Facebook agreed to pay its content moderators $52 million to settle a class-action lawsuit in which the named moderators alleged that they had developed PTSD as a result of reviewing graphic and violent content.

TikTok did not immediately respond to a request for comment.