TikTok Reduces Content Moderation Staff Amid Industry Changes

Thu 20th Feb, 2025

In a shift that mirrors recent moves by other major social media platforms, TikTok has announced a reduction in its content moderation workforce. The decision follows a similar step by Meta, the parent company of Facebook and Instagram, which recently ended its fact-checking program in the United States.

According to insider reports, TikTok is restructuring its Trust and Safety department, a process that will include layoffs. The exact number of affected positions remains unclear, but sources indicate that TikTok employs roughly 40,000 people worldwide in content moderation.

The timing is sensitive for TikTok, which faced scrutiny from the U.S. Congress last year over its commitment to youth safety. TikTok's CEO, Shou Zi Chew, has previously stated that the company invests billions of dollars in content moderation. In October, TikTok laid off around 700 employees and shifted more of its platform-monitoring responsibilities to artificial intelligence systems.

These developments come amid mounting pressure on TikTok from U.S. regulators. In January, the app was briefly taken offline in the U.S. after its parent company, ByteDance, failed to meet a legal deadline to divest its U.S. operations. A reprieve was subsequently granted by the new U.S. administration, which also hinted at a potential deal involving TikTok and U.S. stakeholders.

As the social media landscape continues to evolve, it remains to be seen how these layoffs will affect TikTok's ability to moderate content effectively. The platform's commitment to user safety and accurate information is now under greater scrutiny than ever.
