TikTok Faces Legal Action Over Cuts to UK Online Safety Teams
TikTok is facing legal action over cuts to its UK online safety teams. The social media company announced in August that more than 400 workers would lose their jobs, with AI replacing part of the workforce and other roles being moved overseas.
Background of the Issue
TikTok is accused of threatening these workers with dismissal days before a vote on forming a union. Two moderators have sent a legal letter to TikTok setting out the terms of a potential claim for unlawful detriment and automatic unfair dismissal. Unlawful detriment occurs when an employer treats an employee unfairly because the employee has exercised a protected employment right.
Reaction from Moderators and Supporters
Stella Caram, general counsel at Foxglove, a nonprofit that supports the moderators, said: "In June, TikTok announced it was hiring hundreds more content moderators, and two months later they fired them all. What has changed? Workers have exercised their legal right to try to form a union. This is blatant and illegal union busting." TikTok was given one month to respond to the legal claim.
Response from TikTok
A TikTok spokesperson said: "We once again strongly reject this baseless claim. These changes were part of a broader global reorganization as we continue to evolve our global operating model for trust and safety, leveraging technological advances to maximize safety for our users."
Concerns Over User Safety
Three whistleblowers told Sky News the cuts would put British users at risk, a claim repeated by Julio Miguel Franco, one of the moderators behind the lawsuit. "TikTok needs to tell the truth. When people say that AI can do our job of keeping people safe on TikTok, you know that’s nonsense. Instead, they want to steal our jobs and send them to other countries where they can pay people less and treat them worse. The end result is that TikTok will become less safe for everyone."
Internal Documents and Previous Statements
Internal documents show TikTok planned to keep its human moderators in London for at least the rest of 2025. The documents outline a growing need for dedicated moderators as the scale and complexity of moderation increases. TikTok's governance chief Ali Law also told MPs in February that "human moderators…must use their nuance, skills and training" to moderate hateful behavior, misinformation and disinformation.
Ongoing Debate
Following a series of letters between TikTok and MPs, Dame Chi Onwurah, chair of the Science, Innovation and Technology Select Committee, said she was "deeply concerned" about the cuts. "There is a real danger to the lives of TikTok users," she said. Last month, Mr Law said users' safety would not be compromised. "We set high standards when it comes to introducing new moderation technologies. In particular, we make sure that the output of our existing moderation processes is matched or even exceeded by anything new we introduce."
