Can a platform sleepwalk through a geopolitical storm? The European Commission says no. In a move that will send shivers down the spine of every compliance officer in Silicon Valley, the Commission has opened formal proceedings against TikTok under the Digital Services Act (DSA). The charge? A suspected failure to stop its algorithms from being weaponized during the chaotic Romanian presidential elections, an election in which turnout hovered around 52.5% in the first round and which ultimately saw its initial results voided over manipulation concerns. If you thought the DSA was a paper tiger, think again – Brussels is baring its teeth.
European Commission – Press release
Facts
The dispute traces back to the volatile atmosphere of the recent Romanian elections. On November 24, 2024, turnout hit 52.55%, slightly outpacing 2019 figures as 9.4mn voters went to the polls. The shock came from Călin Georgescu: the previously marginal far-right candidate secured roughly 2.1mn votes, or 22.9% of the total – a result that defied polling data and exposed a highly unstable information ecosystem.
Intelligence assessments now point to coordinated interference by state actors, notably Russia, and TikTok stood directly in the crosshairs. The platform's footprint in the region is massive: by early 2024 it boasted 8.97mn adult users, giving it an advertising reach of 56.3% of adults and 49.6% of the internet user base. The app effectively reaches one in every two Romanian adults, nearing parity with Facebook and its 9.05mn audience.
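For readers who want to sanity-check the reach figures above, the implied population bases can be backed out from the stated numbers. This is a back-of-envelope sketch: the implied totals are derived here, not stated in the Commission's materials.

```python
# Figures as reported for Romania, early 2024
adult_users_mn = 8.97     # TikTok adult users (millions)
adult_reach = 0.563       # advertising reach as a share of all adults
internet_reach = 0.496    # reach as a share of the internet user base

# Implied population bases (derived, not from the source)
implied_adults_mn = adult_users_mn / adult_reach        # ~15.9mn adults
implied_internet_mn = adult_users_mn / internet_reach   # ~18.1mn internet users

print(f"Implied adult population: {implied_adults_mn:.1f}mn")
print(f"Implied internet user base: {implied_internet_mn:.1f}mn")
```

Since the adult-reach share exceeds 50%, the "one in every two Romanian adults" claim follows directly from the source figures.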
Matters came to a head on December 5, 2024, when the Commission took the rare step of issuing a retention order (IP/24/6243) – essentially telling TikTok: “Don’t delete anything.” The order required the freezing of data related to recommender systems and content moderation in order to preserve evidence of potential manipulation tied to the Romanian contest. TikTok was also required to provide information on how it was mitigating these “systemic risks,” including details of its Romanian‑language moderation capacity and its detection of coordinated inauthentic behaviour. However, the answers provided – or perhaps the lack thereof – failed to satisfy the regulator, and the Commission now openly suspects that TikTok’s systems may have amplified inauthentic content, effectively pouring fuel on the electoral fire in a country of roughly 19 million people. Against that backdrop, the Commission escalated the matter from a mere inquiry to formal infringement proceedings (IP/24/6487).
The Decision
The Commission’s opening of proceedings (IP/24/6487) is a masterclass in applying the new regulatory toolkit provided by the DSA to a live electoral crisis. Under the DSA, TikTok is classified as a Very Large Online Platform (VLOP) because it exceeds the 45 million‑monthly‑users threshold in the EU, placing it among 20‑plus services – including Facebook, Instagram and YouTube – that are subject to the regime’s most stringent obligations. The regulator identified potential breaches of Articles 34(1), 34(2), and 35(1), which require VLOPs to assess and duly mitigate systemic risks such as threats to electoral processes and disinformation, and to implement appropriate risk‑mitigation measures.
The reasoning rested on two main planks of law:
- Recommender Systems & Inauthentic Behaviour: The Commission found the platform’s defense wanting regarding its “For You” feed. The core suspicion is that TikTok did not do enough to prevent “coordinated inauthentic manipulation” – for instance, botnets or state‑linked accounts gaming the recommender system to push divisive political content at scale in the run‑up to and aftermath of the November 2024 vote. Given TikTok’s ability to reach over half of Romania’s adult population, a relatively small but coordinated network can, in practice, inject narratives into the feeds of millions of voters within hours.
- Regional & Linguistic Blind Spots: Compounding this difficulty was the localized nature of the threat. The Commission is investigating whether TikTok failed to adequately resource its trust and safety and content‑moderation teams for the specific linguistic and cultural nuances of the Romanian market, despite the fact that Romania accounts for close to 9 million adult TikTok users and generates monthly in‑app revenues estimated in the low‑million‑dollar range by late 2024. The concern is that gaps in native‑language moderation may have left the door ajar for manipulation that a better‑resourced human team or more tailored classifier could have caught.
It was not clear to the regulator that TikTok had “diligently mitigated” these risks. By opening formal proceedings, the Commission is signalling that it believes the platform’s safeguards were possibly unfit for purpose – and it is backing that signal with real enforcement leverage, including the possibility of fines of up to 6% of TikTok’s worldwide annual turnover under the DSA.
Comment
This case is significant because it moves the DSA from theory to practice in the context of “hybrid warfare.” Romania’s 2024–2025 presidential saga – featuring a surprise far‑right frontrunner, allegations of massive foreign interference, an annulled first round and a later rerun in which hard‑right nationalist George Simion led the redo’s first round with around 40% of the vote – provides a stark empirical backdrop for the Commission’s concerns. It sends a chilling signal to other VLOPs: you cannot simply rely on generic, global moderation policies when your recommender systems can reach tens of millions of users across the Union, and over half of all adults in a single member state.
If your algorithm is positioned to influence a national election, you are responsible for the outcome – not in the sense of choosing winners, but in ensuring that your systems do not become an unaccountable amplifier for foreign intelligence services or coordinated domestic manipulation. Expect this to be the test case for how the EU polices the intersection of algorithmic amplification and democratic integrity, and a harbinger of similar DSA actions as more than 23 VLOPs and VLOSEs settle into a compliance regime where systemic‑risk metrics, user‑reach statistics and enforcement files like IP/24/6487 will matter as much as their quarterly earnings reports.
