Waiting for the DSA’s Big Enforcement Moment

By Magdalena Jóźwiak, DSA Observatory (University of Amsterdam)

 

This blog post explores the issue of DSA enforcement by the European Commission, focusing on the law’s systemic risk management provisions. It first briefly sketches the Commission’s role in regulatory oversight of the systemic risk framework, then sums up enforcement efforts to date, also considering the role of geopolitics in the Commission’s enforcement calculus.

 

Introduction 

When it comes to the high-stakes drama of DSA enforcement, the same question is on everybody’s mind: What’s taking so long?

The European Commission has certainly been keeping busy investigating the Very Large Online Platforms (VLOPs) under its supervision. But we have yet to see any sanctions.

A short overview of the enforcement landscape suggests that this slow pace is shaped not only by the complexity of the legal framework itself, but also by the broader political climate – especially transatlantic tensions during ongoing trade negotiations and the war in Ukraine – which complicates the Commission’s position. This post dives deeper into these dynamics.

Competences 

Some of the most innovative aspects of the DSA are the provisions setting out systemic risk management obligations for very large online platforms and search engines (VLOPSEs) under Articles 34 and 35 DSA. If taken seriously, these provisions cut to the core of VLOPSEs’ main activities by forcing them to account for their broad societal impact – a potential challenge to their business models. Yet these provisions are far from clear.

The actor at the centre of the regulatory oversight of the systemic risk framework is the Commission. The DSA enforcement framework was designed to establish a strong, centralised mechanism – one that would avoid the fragmentation and inertia seen under the GDPR. The Commission has exclusive competence to enforce the systemic risk provisions (Article 56.2 DSA), supported by wide investigatory powers (for example, it can request information, conduct interviews or carry out inspections on the premises of the investigated VLOPSE). Under Article 35.3 DSA, it can also issue official guidelines on risk mitigation measures. A proactive stance by the Commission is therefore crucial for the further interpretation of the specific obligations arising from the DSA’s innovative due diligence provisions.

Enforcement to date 

The Commission’s action on the VLOPSE front started swiftly, with the initial designation of 17 platforms as VLOPs and two search engines as VLOSEs on 25 April 2023. Four months after designation, the DSA’s systemic risk provisions became applicable to the designated services. In October 2023, by which time it should already have received the first risk assessment reports, the Commission began sending requests for information (one of its investigatory tools, outlined in Article 67 DSA) to Meta (Instagram and Facebook), TikTok and X. It soon followed up with multiple requests for information to all but two of the designated VLOPSEs.

To date, the Commission has designated 25 different VLOPSEs and sent over 70 requests for information. Information provided by the VLOPSEs led the Commission to open 14 proceedings (Article 66 DSA). The decisions to open proceedings demonstrate the Commission’s focus on social media platforms (8 open proceedings) and, recently, also porn video platforms (4 proceedings). Two proceedings were opened against online marketplaces (AliExpress and Temu).  

So far, all these proceedings have resulted in only two closing decisions, in which the Commission accepted commitments offered by AliExpress and TikTok and made them binding (pursuant to Article 71.1 of the DSA).

While there are as yet no decisions finding non-compliance with the DSA, the Commission has notified its preliminary findings of a breach of DSA provisions in 6 out of the 14 ongoing proceedings (against X, AliExpress, Temu, Meta and, in two proceedings, TikTok). Under Article 73.2 DSA, before adopting a final non-compliance decision the Commission must inform the platform under investigation of its preliminary findings and explain what actions it plans to take, or what actions the platform should take, to fix the issues identified.

Enforcement of the systemic risks framework  

According to information published by the Commission, all of the proceedings opened to date identify potential breaches of risk management obligations. However, only the proceedings against AliExpress and Temu have so far resulted in preliminary findings on risks. In the proceedings against social media platforms, by contrast, there have not yet been any preliminary findings related to risks. This means the Commission has, for now, avoided the difficult question of how systemic risks related to fundamental rights should be interpreted – the very issue that animates much of the discussion on systemic risks. The AliExpress and Temu cases offer little guidance on this front, focusing solely on one fundamental right: the right to a high level of consumer protection (Article 38 of the EU Charter). This right can be concretised through secondary EU law, unlike, for instance, the right to human dignity (Article 1 of the Charter), which calls for more value-laden interpretation. This suggests a cautious enforcement strategy: the Commission started by targeting low-hanging regulatory fruit.

Where the Commission has been particularly active in recent months is the protection of minors. In May 2025, it opened proceedings against all the porn VLOPs (Pornhub, XNXX, Stripchat, XVideos). For Pornhub (but not the others) it published the decision to open proceedings in full, so we know specifically that the Commission suspects infringement of Article 28(1) (the obligation to implement appropriate and proportionate measures to ensure a high level of privacy, safety and security of minors on Pornhub) and of Articles 34(1)&(2) and 35(1) (risks connected to negative effects on the fundamental rights of the child, the protection of minors and minors’ physical and mental well-being). Moreover, in July 2025, the Commission published comprehensive guidelines on the protection of minors.

However, when it comes to other kinds of risks and other VLOPSEs, one might wonder whether the Commission prefers to keep a low profile.

Commentators have warned that building cases against the VLOPSEs, especially in the context of the risk management framework, takes time and expertise, as it requires strong evidence that can withstand likely scrutiny by the courts. This is surely a challenge for the Commission, which had to build its enforcement capacity almost from scratch. If competition law cases are any indication, consider the Google Shopping case, in which the Commission took almost seven years from its initial investigation to reach a final decision.

Even so, the absence of any DSA sanctions to this point is striking. The Commission’s earliest preliminary findings – issued more than a year ago against X – addressed issues such as dark patterns (Article 25 – specifically how X operated its ‘verified accounts’, where anyone could simply buy a ‘blue checkmark’), advertising transparency (Article 39), and researchers’ access to publicly available data (Article 40(12)). These are relatively simple compliance questions compared to systemic risk management. Yet no sanctions have followed. Just before the summer of 2025, X started displaying a disclaimer explaining how the ‘blue checkmark’ operates on the platform, reportedly in response to the Commission’s ongoing investigation. The Commission confirmed that it took note of this change and that the investigation was still ongoing. It remains unclear how this change has affected the Commission’s enforcement efforts and where its investigation now stands.

The slow pace of investigations has raised questions about whether the Commission is deliberately delaying enforcement, perhaps to avoid conflict with US officials who have been openly critical of the DSA during ongoing trade negotiations. 

DSA enforcement and the political climate

Several months after the Commission announced its preliminary findings against X in July 2024, Donald Trump was elected to a second term as president. His inauguration signalled a shift in how major digital platforms positioned themselves ideologically, as many platform billionaires attended and publicly aligned themselves with Trump. Among them were Mark Zuckerberg, the founder and largest individual shareholder of Meta, and Elon Musk, the owner of X.

Zuckerberg’s change of heart was the most remarkable, as Meta famously banned Trump’s account back in 2021 after the assault on the Capitol. In 2025, Zuckerberg posted a video on Facebook in which he pledged allegiance to Trump, saying: “[w]e’re going to work with President Trump to push back on governments around the world. They’re going after American companies and pushing to censor more”. He also announced important changes to Meta’s moderation policies, for example, eliminating fact-checkers in the US and dropping rules protecting LGBTQ communities.

Musk, meanwhile, has long been a vocal critic of the EU’s efforts to regulate online platforms. After his appointment as a special government employee heading the Department of Government Efficiency, he continued to invoke familiar tropes of “censorship” and “attacks on free speech.”

Early in Trump’s second term, similar accusations were voiced by several US officials. Vice President JD Vance warned in his 2025 speech at the Munich Security Conference against the “retreat of Europe from some of its most fundamental values.” The State Department described the DSA as “Orwellian.”  

The resulting narrative conveys a deep transatlantic divide: Europeans, it claims, are slowly sliding towards tyranny, compromising freedom of speech under the pretext of fairness while effectively imposing a tax on US companies. These claims were later expanded upon in the US House of Representatives staff report titled ‘The Foreign Censorship Threat: How the European Union’s Digital Services Act Compels Global Censorship and Infringes on American Free Speech.’

However unfounded and politically opportunistic these claims were, they transformed EU digital regulation into a central ideological talking point for the MAGA movement – one that resonates with populist and far-right actors in Europe as well.  

As US pressure mounted, many commentators feared that the EU might delay enforcement or scrap the DSA altogether. In this climate, the Commission could well consider it risky to sanction X, owned by Trump frenemy Elon Musk, in the context of the ongoing US-EU trade negotiations. The Commission repeatedly denied that the DSA could become a bargaining chip in the trade talks, and in April 2025, anonymous sources suggested that it would proceed as planned and issue sanctions against X in the summer of 2025. Yet as autumn arrives, those sanctions have still not materialised.

One notable development that did happen over the summer was the new trade deal between the United States and the European Union. A joint statement on transatlantic trade, issued at the end of August, followed a political agreement reached in July between Donald Trump and Ursula von der Leyen. The deal included a 15% tariff on EU exports to the US. The agreement is now being scrutinised by the European Parliament as a legislative proposal under the ordinary legislative procedure, and the draft report by the Parliament’s Committee on International Trade has voiced certain criticisms of the deal.

The joint statement did not refer explicitly to the DSA. It mentioned non-tariff barriers to trade, which the US considers the DSA to be, only in passing: “[t]he United States and the European Union commit to work together to reduce or eliminate non-tariff barriers.” It also contained a vague pledge to “address unjustified digital trade barriers.”  

Against this backdrop, it remains unclear what the trade deal means for the DSA. Recent reports suggest the trade agreement may prove short-lived: by October, the US administration had already renewed its demands for concessions on EU digital rules, including the DSA and the Digital Markets Act. 

What comes next?  

The Commission, when asked, continues to reaffirm its commitment to enforcing the DSA. This stance appears consistent with the most recent preliminary findings issued in the ongoing investigations against Meta and TikTok. Moreover, voices from across the EU – including academics, civil society, and several political leaders – have pushed back against the US narrative and urged the Commission to take a firm stance on enforcement. At the same time, in a recent interview, Commissioner Virkkunen acknowledged the delicacy of the situation, indicating that the Commission should not be “provocative” vis-à-vis the US.

We still don’t know how the Commission’s investigations will ultimately play out. But given where things stand, there are three interim conclusions we can already draw about the enforcement regime.

First, a lack of transparency. It is striking that much of the information about the enforcement process has not been communicated through comprehensive reporting by the Commission, but instead through fragments and rumours circulating within the Brussels bubble. Some commentators have criticised the lack of transparency in this process: for instance, decisions to issue preliminary findings are sometimes published in full and sometimes only summarised in brief and superficial press releases, leaving gaps in the public record.  

This fragmented communication limits informed scrutiny and undermines the perceived procedural legitimacy of the DSA’s enforcement. On this note, the EU Ombudswoman, Teresa Anjinho, recently found maladministration by the Commission in a case concerning the refusal of access to X’s systemic risk reports. When a journalist asked for access to the reports before their publication became obligatory, the Commission applied a general presumption of non-disclosure. Applying the presumption in this way was unreasonable, according to the Ombudswoman, who recommended a case-by-case assessment of each request.

Second, control of the narrative is slipping. The Commission’s hesitation to probe systemic risks has left an informational void that VLOPs readily filled with their own (by and large self-serving) risk reports. The independent audits offered little corrective insight and often cited the Commission’s ongoing investigations as grounds for skipping the most contentious obligations. Civil society was not widely consulted by the companies in the preparation of the first reports, and its voice was largely absent. Last week, however, the European Board for Digital Services, together with the Commission, published its first report on prominent risks and best practices, drawing extensively on civil society’s work. This intervention could help shape a risk discourse grounded in public values rather than corporate bottom lines.

Third, centralisation as a political pressure point. What was designed to be the DSA’s greatest strength – a centralised enforcement mechanism within the Commission – is, one could argue, becoming one of its weaknesses. The very centralisation that promised coherence and authority has also made the process more susceptible to political pressure, given that the Commission is itself a political body rather than an independent institution. This is particularly true in times of political instability.

Upcoming developments in DSA enforcement will show whether the EU can deliver on its ambitions to reshape the digital sphere in line with its stated values. For now, the ball is still in the Commission’s court.