Technology & Innovation

TikTok used AI in nearly 100% of violating-content moderation in Europe

TikTok removed 112 million pieces of content – mainly with automated systems – that violated the platform’s policies in the EU between July and December 2025, according to the platform’s latest Digital Services Act transparency report.

  • Owen Carpenter-Zehe
  • March 2, 2026

The Chinese social media giant published its sixth transparency report, as required by the Digital Services Act, on Friday (27 February). The document outlines the platform's content moderation strategy and the actions it has taken.

TikTok said it was improving, and increasingly relying on, its automated moderation systems for the removals. These systems include artificial intelligence tools.

The social media platform said it was using AI technology to speed up detection and better shield its moderators from distressing content. The technology is also used to "help make it easier to moderate nuanced areas like misinformation."

TikTok reports that over the fourth quarter of 2025, its automated AI tools "actioned 93.8 percent of all violating content without human review… with 97 percent of automated enforcement decisions being confirmed as correct."

The new report comes amid intense scrutiny of the social media platform across the EU, as European regulators are increasingly examining its design and content. 

For example, the European Commission preliminarily found in February that the service was in breach of the bloc’s addictive design rules, and in December last year, the platform agreed to resolve a commission complaint by increasing its advertising transparency.

There are also growing initiatives across the continent to ban social media for minors, due in part to children's potential exposure to harmful content.

On the human side of moderation, the report discloses that TikTok employs 91 people and contracts 3,583 human moderators to support the automated systems.

TikTok claims that it removed 99.3 percent of violating content before it was reported. 

But the Digital Services Act (DSA) requires platforms to provide mechanisms for users to report illegal content, and TikTok received 714,000 such user reports during the fourth quarter.

Member state authorities can also contact TikTok to remove illegal content, and the platform received 782 requests during this period.

France and Romania were the EU member states with the most removal requests.

TikTok reported hosting 187 million monthly EU users.
