Published Date : 23/08/2025
TikTok is planning to lay off hundreds of staff in the UK who moderate the content that appears on the social media platform. According to TikTok, the plan would see work moved to its other offices in Europe as it invests in the use of artificial intelligence (AI) to scale up its moderation efforts.
"We are continuing a reorganisation that we started last year to strengthen our global operating model for Trust and Safety, which includes concentrating our operations in fewer locations globally," a TikTok spokesperson told the BBC. However, a spokesperson for the Communication Workers Union (CWU) said the decision was "putting corporate greed over the safety of workers and the public".
"TikTok workers have long been sounding the alarm over the real-world costs of cutting human moderation teams in favour of hastily developed, immature AI alternatives," said John Chadfield, the CWU's National Officer for Tech. He added that the cuts had been announced just as the company's workers were about to vote on having their union recognised.
But TikTok said the move would "maximise effectiveness and speed as we evolve this critical function for the company with the benefit of technological advancements". The affected staff work in its Trust and Safety team in London; hundreds more workers in the same department in parts of Asia are also impacted. TikTok uses a combination of automated systems and human moderators. According to the firm, 85% of posts that break the rules are removed by its automated systems, including AI.
According to the firm, this investment is helping to reduce how often human reviewers are exposed to distressing footage. Affected staff will be able to apply to other internal roles and will be given priority if they meet the job's minimum requirements.
The move comes at a time when the UK has increased the requirements on companies to check the content that appears on their platforms, and particularly the age of those viewing it. The Online Safety Act came into force in July, bringing with it potential fines of up to 10% of a business's total global turnover for non-compliance. TikTok brought in new parental controls that month, which allowed parents to block specific accounts from interacting with their child, as well as giving them more information about the privacy settings their older teenagers are using.
However, it has also faced criticism in the UK for not doing enough, with the UK data watchdog launching what it called a "major investigation" into the firm in March. TikTok told the BBC at the time that its recommender systems operated under "strict and comprehensive measures that protect the privacy and safety of teens".
Q: Why is TikTok laying off content moderators in the UK?
A: TikTok is laying off content moderators in the UK as part of a reorganization to strengthen its global operating model for Trust and Safety. The company is shifting towards more AI-driven content moderation to improve efficiency and speed.
Q: How many staff will be affected by these layoffs?
A: Hundreds of staff in the UK and parts of Asia who work in TikTok's Trust and Safety team will be affected by these layoffs.
Q: What is the Communication Workers Union's (CWU) stance on these layoffs?
A: The CWU has criticised TikTok for putting corporate greed over the safety of workers and the public. It argues that cutting human moderation teams in favour of hastily developed, immature AI alternatives carries real-world costs.
Q: What measures is TikTok taking to support affected staff?
A: Affected staff will be able to apply to other internal roles and will be given priority if they meet the job's minimum requirements.
Q: What is the Online Safety Act, and how does it affect TikTok?
A: The Online Safety Act, which came into force in July, increases the requirements for companies to check the content on their platforms, especially regarding the age of users. Non-compliance can result in fines of up to 10% of a business's total global turnover.