TikTok has placed hundreds of UK jobs at risk as it increasingly uses AI to moderate content.
The layoffs will be within its trust and safety operations and come as part of a global restructure. Currently more than 85% of content that is taken down for violating TikTok community guidelines is flagged by AI.
Under the proposed plans, some work will be reallocated to other offices in Europe and some third-party providers. TikTok - which allows users to create, watch and share short-form videos - currently employs more than 2,500 staff in the UK, with its head office in Farringdon in London.
Job roles in south and south-east Asia will also be affected by the restructure. TikTok argues that using AI reduces the amount of distressing or graphic content that its moderation teams are exposed to.
A spokesman for the social media firm said: “We are continuing a reorganisation that we started last year to strengthen our global operating model for trust and safety, which includes concentrating our operations in fewer locations globally to ensure that we maximise effectiveness and speed as we evolve this critical function for the company with the benefit of technological advancements.”
A spokesperson for the Communication Workers Union said: “This news will put TikTok’s millions of British users at risk.
“TikTok workers have long been sounding the alarm over the real-world costs of cutting human moderation teams in favour of hastily developed, immature AI alternatives.
“This has been a constant concern throughout the process of TikTok workers’ efforts to form a union.”
The update from TikTok comes after the new Online Safety Act came into force last month. The new rules mean online platforms have a greater responsibility to prevent children accessing harmful content such as pornography or material that encourages suicide.
This could include changing algorithms to filter out harmful content, introducing age verification checks and removing harmful content quickly.
Failure to comply could result in businesses being fined £18 million or 10% of their global revenues, whichever is higher, or their executives being jailed. The rules are implemented and enforced by Ofcom.