TikTok and the Romanian elections: A stress test for DSA enforcement
by John Albert, DSA Observatory
20 December 2024
On 17 December, the Commission opened formal proceedings against TikTok to investigate whether the platform diligently managed electoral risks in the Romanian context. This case demonstrates two gears of regulatory action under the DSA: the slow, methodical pace of analyzing systemic risks and setting standards, and the fast, politically charged scramble to respond to high-profile crises.
Election, Interrupted
On 6 December, Romania’s constitutional court unanimously voted to annul the results of the first round of the election, based in part on Romanian intelligence services’ findings—which have since been declassified—that Russian influence operations manipulated social media to promote the independent nationalist candidate Călin Georgescu in violation of the country’s electoral integrity laws.
The Romanian court’s controversial decision was based on Romanian electoral law. But Georgescu’s sudden rise from fringe candidate to frontrunner is also an early stress test for the DSA, putting major online platforms—especially TikTok—under acute regulatory scrutiny for how they manage risks related to electoral manipulation. EU and national authorities also face renewed pressure to step up their cooperation and monitoring efforts ahead of upcoming elections.
TikTok under investigation
Romanians went to the polls on 24 November; the Commission quickly escalated enforcement actions against TikTok over the weeks that followed. A formal request for information on 29 November was followed by a “retention order” on 5 December, culminating on 17 December when the Commission opened formal proceedings against the platform under the DSA.
The crux of the Commission’s investigation is whether TikTok diligently managed risks related to elections and civic discourse, especially linked to:
- TikTok’s recommender systems (and risks stemming from coordinated inauthentic manipulation or automated exploitation of the platform)
- TikTok’s policies on political advertisements and paid-for political content (given evidence of paid Romanian influencers circumventing TikTok’s prohibition on political ads).
The Commission will also investigate whether TikTok’s risk mitigations are sufficient with regard to specific regional and linguistic aspects of national elections.
How do we evaluate TikTok’s management of electoral risks?
Evidence raised against TikTok so far, including from third parties like EDMO, does point to shortcomings in the company’s risk management efforts along the lines of the Commission’s investigation. And an independent experiment by the NGO Global Witness confirmed that TikTok’s “For You” feed gave significant preferential treatment to Georgescu over his rival candidate, Elena Lasconi—though it’s an open question how much this disparity was influenced by malign actors rather than driven by user behavior.
The question for DSA watchers is: What does it mean for a platform to diligently identify and mitigate election-related risks, and where can the Commission legally draw the line on violations?
TikTok hasn’t been sitting on its hands—the platform released a long statement on 6 December (updated 17 December) detailing its election integrity efforts in Romania, which include disrupting several small covert influence networks. In its recently released 2023 systemic risk assessment report, the company declares election misinformation a “Tier 1” priority and outlines a dynamic mitigation approach.
But we shouldn’t take TikTok’s word for it. What does the company’s auditor, KPMG, say about the company’s risk-management efforts? They lack sufficient information to form an opinion, according to the recently published audit report of TikTok’s still undisclosed 2024 risk assessment. KPMG cites the Commission’s ongoing formal proceedings against TikTok, saying it can’t complete the audit without documentation “explaining the reasons for and/or benchmarks/criteria underlying the formal proceedings.”
Setting benchmarks for risk management has been a DSA hot potato—and the Commission appears to be holding it now, adding the latest TikTok proceedings to a spate of open investigations.
Systemic failures go beyond TikTok
The Commission’s decision to target TikTok so prominently—when evidence points to systemic failures across the board—underscores how political decisions can shape enforcement priorities. TikTok’s massive popularity in Romania puts it at the center of the current controversy, but it’s clear that the systemic risks raised in this case extend beyond a single platform.
Take the issue of political ads, for example: we know that TikTok is not alone in systematically failing to enforce its own policies, even in Romania. An investigation by CheckFirst shows similar coordinated influence campaigns targeting Romanians on Meta platforms, repeatedly violating the company’s ad policies, with evidence of cross-platform coordination, including through Google Ads.
We also know that Telegram played a key role in coordinating Georgescu’s campaign. A DFR Lab investigation shows how Telegram operated as a command hub, distributing content to thousands of “volunteers” along with instructions to personalize the distributed materials before posting—a strategy that helped prevent algorithmic filters from flagging the content as spam or repetitive.
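To see why asking volunteers to lightly personalize a shared template is effective against naive duplicate detection, consider a minimal, purely illustrative sketch. This is not TikTok's actual filtering logic (which is undisclosed); the `fingerprint` function and the sample strings are hypothetical, and real platforms use far more sophisticated near-duplicate and behavioral signals.

```python
import hashlib

def fingerprint(text: str) -> str:
    """Hypothetical exact-match fingerprint: hash the normalized text.
    Any edit, however small, yields a completely different digest."""
    return hashlib.sha256(text.strip().lower().encode("utf-8")).hexdigest()

# A shared campaign template and a trivially "personalized" copy.
template = "Vote for the candidate who puts our country first!"
personalized = "Friends, vote for the candidate who puts our country first!!"

# Identical copies collide, so mass reposting of the raw template is easy to flag...
assert fingerprint(template) == fingerprint(template)

# ...but a lightly edited copy produces a different fingerprint,
# so an exact-duplicate filter treats it as original content.
assert fingerprint(template) != fingerprint(personalized)
```

This is why detection of such campaigns tends to rely on coordination signals (timing, account networks, shared links) rather than content matching alone.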
This kind of political coordination is not necessarily illegal. But it shows how increasingly sophisticated online tactics can circumvent internal platform policies and potentially undermine the DSA—especially given that Telegram, despite hosting channels with many thousands of followers, is not currently subject to the same risk management obligations and oversight as designated “Very Large Online Platforms” (VLOPs) like TikTok and Facebook.
So did the DSA fail in Romania?
On 26 November, an open letter signed by 21 Romanian civil society organizations criticized not only TikTok but both the European Commission and ANCOM, Romania’s Digital Services Coordinator, for failing to proactively or transparently engage with platforms during a critical electoral moment. They also condemned national authorities for disregarding the DSA in their haphazard efforts to restrict online speech, saying this eroded public trust in the rule of law.
The DSA enforcement apparatus is still getting up to speed—it feels premature to say that it failed in this instance. We also can’t know if more robust or proactive measures would have changed the election result. But as 2024 comes to an end, we’re seeing two gears of regulatory action: the slow, methodical pace of analyzing systemic risks and setting standards, and the fast, politically charged scramble to respond to high-profile crises.
New Year’s resolutions for DSA enforcement
The Romanian case underscores urgent lessons we shouldn’t ignore in 2025, from paid influencers dodging platforms’ ad rules to political actors using cross-platform coordination and tactics to game algorithmic recommender systems and evade detection. Platforms are now on notice, but so too are the institutions tasked with enforcing the rules.