- Two former TikTok content moderators are suing the company over a lack of support while flagging toxic images.
- Both worked on TikTok's proprietary system, which monitors how many videos they watch.
- The suit adds to similar complaints that tech giants are not protecting moderators from trauma.
Two former TikTok content moderators are suing the video-sharing app and its Beijing-based parent company, ByteDance, for imposing tough quotas and not providing enough mental health support to help them weed out “highly toxic and extremely disturbing images.”
Ashley Velez and Reece Young allege that TikTok broke California labor laws by exposing them to serious harm and trauma, according to a lawsuit filed by the Joseph Saveri Law Firm on March 24. They are seeking class-action status for the suit. The lawsuit adds to growing calls from content moderators who say social-media giants are not doing enough to protect them from the "worst of humanity."