Case IP/24/6487, TikTok – The Romanian Rumble

Can a platform sleepwalk through a geopolitical storm? The European Commission says no. In a move that will send shivers down the spine of every compliance officer in Silicon Valley, the Commission has opened formal proceedings against TikTok under the Digital Services Act (DSA). The charge? A suspected failure to stop its algorithms from being weaponized during the chaotic Romanian presidential elections, an election in which turnout hovered around 52.5% in the first round and which ultimately saw its initial results voided over manipulation concerns. If you thought the DSA was a paper tiger, think again – Brussels is baring its teeth.

European Commission – Press release

Facts

The dispute traces back to the volatile atmosphere of the recent Romanian elections. On November 24, 2024, turnout hit 52.55%, slightly outpacing 2019 figures as 9.4mn voters went to the polls. Călin Georgescu delivered the shock: the previously marginal far-right candidate secured roughly 2.1mn votes, or 22.9% of the total, a result that defied the polling data and exposed a highly unstable information ecosystem.

Intelligence now points to coordinated interference by state actors, with Russia the prime suspect. TikTok stood directly in the crosshairs. Our analysts note the platform’s massive footprint in the region: by early 2024 it boasted 8.97mn adult users, giving it an advertising reach of 56.3% of adults and 49.6% of the internet user base. The app effectively reaches one in every two Romanian adults, nearing parity with Facebook and its 9.05mn audience.
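Those reach figures can be sanity-checked with a quick back-of-the-envelope calculation. This is only a sketch: the population bases below are implied by the percentages quoted above, not official statistics.

```python
# Back-of-the-envelope check of the TikTok reach figures quoted above.
# Inputs are the article's numbers; the derived bases are implied estimates.

tiktok_adult_users_mn = 8.97   # adult TikTok users in Romania, early 2024
adult_reach_pct = 56.3         # ad reach as a share of the adult population
internet_reach_pct = 49.6      # ad reach as a share of internet users

implied_adults_mn = tiktok_adult_users_mn / (adult_reach_pct / 100)
implied_internet_users_mn = tiktok_adult_users_mn / (internet_reach_pct / 100)

print(f"Implied adult population: ~{implied_adults_mn:.1f}mn")         # ~15.9mn
print(f"Implied internet users:   ~{implied_internet_users_mn:.1f}mn")  # ~18.1mn
```

At an implied adult population of roughly 15.9mn, 8.97mn users is indeed about one in two, consistent with the claim above.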

Matters came to a head on December 5, 2024, when the Commission took the rare step of issuing a retention order (IP/24/6243), essentially telling TikTok: “Don’t delete anything.” The order required the freezing of data related to recommender systems and content moderation in order to preserve evidence of potential manipulation tied to the Romanian contest. TikTok was also required to provide information on how it was mitigating these “systemic risks,” including details of its Romanian‑language moderation capacity and its detection of coordinated inauthentic behaviour. The answers provided (or the lack thereof) failed to satisfy the regulator, and the Commission now openly suspects that TikTok’s systems may have amplified inauthentic content, effectively pouring fuel on the electoral fire in a country of roughly 19 million people. Against that backdrop, the Commission escalated the matter from a mere inquiry to formal infringement proceedings (IP/24/6487).

The Decision

The Commission’s opening of proceedings (IP/24/6487) is a masterclass in applying the DSA’s new regulatory toolkit to a live electoral crisis. Under the DSA, TikTok is classified as a Very Large Online Platform (VLOP) because it exceeds the threshold of 45 million monthly users in the EU, placing it among the 20‑plus services – including Facebook, Instagram and YouTube – subject to the regime’s most stringent obligations. The regulator identified potential breaches of Articles 34(1), 34(2) and 35(1), which require VLOPs to assess systemic risks such as threats to electoral processes and disinformation, and to put in place reasonable, proportionate and effective mitigation measures.

The reasoning rests on two main planks:

  • Recommender Systems & Inauthentic Behaviour: The Commission found the platform’s defence of its “For You” feed wanting. The core suspicion is that TikTok did not do enough to prevent “coordinated inauthentic manipulation” – for instance, botnets or state‑linked accounts gaming the recommender system to push divisive political content at scale in the run‑up to and aftermath of the November 2024 vote. Given TikTok’s ability to reach over half of Romania’s adult population, a relatively small but coordinated network can, in practice, inject narratives into the feeds of millions of voters within hours.
  • Regional & Linguistic Blind Spots: Compounding the problem was the localized nature of the threat. The Commission is investigating whether TikTok failed to adequately resource its trust and safety and content‑moderation teams for the specific linguistic and cultural nuances of the Romanian market, despite the fact that Romania accounts for close to 9 million adult TikTok users and generates monthly in‑app revenues estimated in the low‑million‑dollar range by late 2024. The concern is that gaps in native‑language moderation may have left the door ajar for manipulation that a better‑resourced human team or a more tailored classifier could have caught.

It was not clear to the regulator that TikTok had “diligently mitigated” these risks. By opening formal proceedings, the Commission is signalling that it believes the platform’s safeguards were possibly unfit for purpose – and it is backing that signal with real enforcement leverage, including the possibility of fines of up to 6% of TikTok’s worldwide annual turnover under the DSA.

Comment

This case is significant because it moves the DSA from theory to practice in the context of “hybrid warfare.” Romania’s 2024–2025 presidential saga – featuring a surprise far‑right frontrunner, allegations of massive foreign interference, an annulled first round and a rerun in which hard‑right nationalist George Simion led its first round with around 40% of the vote – provides a stark empirical backdrop for the Commission’s concerns. It sends a chilling signal to other VLOPs: you cannot simply rely on generic, global moderation policies when your recommender systems can reach tens of millions of users across the Union, and over half of all adults in a single member state.

If your algorithm is positioned to influence a national election, you are responsible for the outcome – not in the sense of choosing winners, but in ensuring that your systems do not become an unaccountable amplifier for foreign intelligence services or coordinated domestic manipulation. Expect this to be the test case for how the EU polices the intersection of algorithmic amplification and democratic integrity, and a harbinger of similar DSA actions as more than 23 VLOPs and VLOSEs settle into a compliance regime where systemic‑risk metrics, user‑reach statistics and enforcement files like IP/24/6487 will matter as much as their quarterly earnings reports.

Commission Decision, X (formerly Twitter) – A €120 Million Invoice for a “Blue Tick” that Verifies Nothing

If a user pays for a badge that implies identity, but the platform only verifies their credit card, is that a service or a scam? The European Commission has decided it is the latter. In a landmark decision – the first of its kind under the Digital Services Act (DSA) – the Brussels executive has fined Elon Musk’s X €120 million. The charge? Deceiving users with “dark patterns” and blocking the researchers trying to figure out what is really happening on the platform. The reaction from the platform’s owner was swift and characteristic: a demand to “Abolish the EU” and the immediate retaliatory blocking of the Commission’s own advertising account.

European Commission – Press release

The Parties: The European Commission (acting as the primary enforcer of the DSA) vs. X (formerly Twitter), a designated Very Large Online Platform (VLOP) serving approximately 102 million active monthly users in the EU.

The Origin: The dispute dates back to the “chaos era” of X that followed its 2022 acquisition. On 18 December 2023, the Commission opened formal proceedings. After a nearly two-year investigation, the Commission concluded that X’s “compliance” was largely performative.

The Financial Breakdown:
The €120 million fine is not a lump sum but a calculated aggregate of three specific failures:

  • €45 million for the “deceptive design” of the Blue Checkmarks.
  • €40 million for failing to provide researchers with legally required data access.
  • €35 million for insufficient transparency in the advertising repository.

At the Berlaymont (The Legal Reasoning)

The Commission’s decision is built on the principle that “transparency is the price of market access.” The investigators were stymied by X’s refusal to align its architecture with EU norms.

1. The “Blue Checkmark” Deception (Article 25 DSA)
The Commission found that X’s interface manipulated users. Historically, the “Blue Tick” was a sign of identity verification (a status symbol for journalists, officials, and celebrities). X changed this to a paid feature ($8/month).

  • The Conflict: The Commission argued this was a “dark pattern.” By using a symbol universally associated with authenticity for a purely commercial subscription, X misled millions of users. It was not clear to the average scroller whether an account was “verified” or simply “funded.”
  • The Statistic: The Commission noted that this design choice affected 100% of the platform’s EU user base, exposing them to higher risks of impersonation scams.

2. The “Black Box” Ad Repository (Article 39 DSA)
The DSA mandates that VLOPs must maintain a searchable repository of all ads to allow civil society to monitor election interference and disinformation.

  • The Difficulty: X’s repository was found to be “labyrinthine.” It lacked critical metadata – specifically the identity of the legal entity paying for the ad.
  • The Finding: The Commission ruled that a repository that is technically “online” but functionally unusable (due to delays and missing fields) does not satisfy Article 39.

3. Locking Out the Watchdogs (Article 40 DSA)
Perhaps the most contentious point was X’s hostility toward academic scrutiny.

  • The Barrier: X’s Terms of Service effectively banned researchers from scraping public data. Furthermore, the API access fees were deemed “dissuasive” – priced so high that independent universities could not afford them.
  • The Consequence: By blinding the researchers, X effectively shielded itself from independent risk assessment regarding hate speech and hybrid threats.

The Penalty & Next Steps

The Fine: €120 million (approx. 4.5% of X’s estimated EU-attributable revenue, though well below the global 6% cap).
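The arithmetic can be cross-checked in a few lines. The component amounts come from the breakdown above; the revenue base is back-derived from the “approx. 4.5%” estimate and is therefore itself only an approximation, not a figure from the decision.

```python
# Cross-check of the fine's composition and the implied revenue base.
# Component amounts are from the decision as reported; the revenue figure
# is back-derived from the article's "approx. 4.5%" estimate.

components_mn = {
    "deceptive blue-checkmark design (Art. 25)": 45,
    "researcher data access (Art. 40)": 40,
    "ad repository transparency (Art. 39)": 35,
}

total_mn = sum(components_mn.values())
assert total_mn == 120  # matches the headline €120 million figure

implied_eu_revenue_mn = total_mn / 0.045  # if the fine is ~4.5% of it
print(f"Implied EU-attributable revenue: ~€{implied_eu_revenue_mn:,.0f}mn")
# ~€2,667mn
```

On those numbers, a 6% global cap would bite only at a much larger worldwide turnover, which is why the decision sits comfortably below the DSA ceiling.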

The Ultimatum:

  • 60 Days: To rectify the “deceptive” blue checkmarks.
  • 90 Days: To overhaul the ads repository and grant API access to vetted researchers.
  • The Retaliation: In a move that may trigger further “non-cooperation” fines, X deactivated the European Commission’s ad account shortly after the decision was published.

Comment

This is the DSA’s first real bite. It sends a chilling signal to other VLOPs (like Meta and TikTok) that “compliance” cannot be a box-ticking exercise. The breakdown of the fine reveals that the Commission places nearly as much weight on Researcher Access (€40m) as on Consumer Protection (€45m).

Against that backdrop, the decision sets a vital precedent: “Design” is now a legal battlefield. If your UI tricks the user, or if your API pricing is a de facto ban on scrutiny, the Commission is willing to fine you for it.

Legal Shield: scam, pyramid, or legitimate opportunity?

In today’s professional landscape, the allure of autonomy, self-determination and the prospect of crafting a unique career trajectory has gained significant traction. Legal Shield, a provider of legal services operating within the multi-level marketing (MLM) framework, has been a focal point of this debate for decades. The pivotal question remains: does Legal Shield present a bona fide opportunity, or is it emblematic of the typical MLM pitfalls? This article undertakes a comprehensive examination.

Continue reading

Case C-498/16, Schrems – a Facebook consumer or simply in the business of privacy?

This case concerns a person with a Facebook account. He uses it not only to exchange private photographs and chat with about 250 friends but also for publicity purposes. The legal issue is whether this latter activity stops him from qualifying as a ‘consumer’. The definition matters because if he is a consumer, then he and several thousand other Austrians who are aggrieved at Facebook’s use of their personal data will be able to sue in the Austrian courts.

Continue reading

Case C-267/16, Buhagiar – Gibraltar, guns and the constitutional order

The Supreme Court of Gibraltar has made its first preliminary reference to the CJEU, and the burning issue is the free movement of hunters’ firearms.

Continue reading

Case C-169/15, Montis Design – EU copyright and Benelux design formalities, a game of musical chairs?

When a company owns the Benelux rights in the design of a chair but then it fails to maintain the registration of its Benelux rights under Benelux law, can a rival company still be stopped from making similar chairs because of the links between the old Benelux law and current EU law? More specifically, what is the relationship between Benelux rights and the EU’s ‘term of protection’ Directive 93/98/EEC?

Continue reading