The DSA’s Systemic Risk Framework: Taking Stock and Looking Ahead
By Magdalena Jóźwiak
DSA Observatory
Drawing on a March 2025 workshop hosted by the DSA Observatory, this post shares reflections from researchers and civil society experts engaging with the DSA’s systemic risk framework—examining legal foundations, enforcement challenges, and the role of the research community in shaping its development.
On March 28, 2025, the DSA Observatory at the Institute for Information Law (IViR) hosted a full-day workshop on two interlinked aspects of the Digital Services Act: systemic risk management under Articles 34 and 35, and researcher data access under Article 40(4).
Organized as part of the DSA Research Network and supported by Stiftung Mercator, the workshop brought together academic and civil society researchers, legal scholars, and policy experts to discuss the DSA’s transparency and accountability framework. Participants split into two tracks to examine both the practical challenges of implementing and enforcing the DSA and the broader political context in which it is unfolding.
This blog post focuses on the systemic risks track. A companion post explores researcher access under Article 40(4), drawing on the parallel track of the same workshop.
The context of the workshop
The DSA introduced some genuine regulatory firsts. Among the most significant is the legal requirement for Very Large Online Platforms and Search Engines ('VLOs' for short) to manage systemic risks linked to how their services are designed, operated, and used.
This framework raises considerable questions – not only due to its legal and technical complexity, but also because of the political context surrounding its enforcement.
With key operational guidance still forthcoming – including a best practices report from the European Board for Digital Services (Art. 35(2) DSA) and a Commission-funded study by Open Evidence – this moment represents a critical juncture in the DSA’s implementation.
Against this backdrop, the workshop explored three interlinked themes: (1) the DSA’s legal framework for systemic risk management, (2) the enforcement challenges ahead, and (3) the role of researchers and civil society in shaping its future.
Workshop participants raised many open questions about the systemic risk framework, including how risks are defined, interpreted, and operationalized by different actors. Yet one point stood out clearly: spaces for sustained, in-depth discussion – like this workshop – are indispensable for advancing understanding and collaboration in this evolving field.
Discussion Topics: Politics and DSA enforcement
The opening plenary session made clear that the DSA is not just a regulatory instrument but a key piece in a broader geopolitical puzzle – a factor that significantly shapes both platform compliance and the Commission’s enforcement strategies.
One major concern among participants was the return of the Trump administration in the US. Its impact is already being felt, with major US-based VLOs like Meta and X actively contesting their DSA obligations and appearing to align themselves more closely with the new administration. Notably, Meta CEO Mark Zuckerberg shifted his stance on content moderation, announcing significant changes to Meta’s policies, such as ending partnerships with fact-checkers in the US.
These changes suggest a strategic pivot – one that could embolden these companies to resist the Commission’s future enforcement actions (a dynamic which may play out in their attitudes to DSA compliance, including the risk assessments under Articles 34 and 35).
The political context also raises questions about the roles of the Commission and the national Digital Services Coordinators (DSCs) as regulators. As a political institution, the Commission may see its enforcement strategy shaped by both internal and external pressures. Former Commissioner Thierry Breton’s controversial attempts to publicly ‘jawbone’ platforms into specific content moderation choices illustrated how political interference can blur regulatory boundaries.
At the national level, the political leanings of individual DSCs may also come into play. Some DSCs, like the Irish authority, are expected to be particularly influential, given that many VLOs are based in Ireland.
In this context, it was put forward that researchers and civil society actors have a crucial responsibility: to stay vigilant and uphold a principled, independent approach to monitoring and engaging with the DSA’s enforcement.
From metrics to meaning
As experts following the DSA debate know, the concept of systemic risks remains elusive and poorly defined.
While the workshop did not offer definitive answers, participants shared valuable insights on how the term should be interpreted moving forward. One suggestion was a ‘dual track’ approach: keeping the concept broad when it comes to data access requests (in view of the function of Article 40), while applying a more narrowly defined, benchmark-based framework for the risk assessment requirement under Article 34.
This approach could support transparent compliance, but it raises further questions: How to condense broad risk categories into meaningful benchmarks? And would narrowing the definition of systemic risks too early limit the DSA’s potential? Many argued that a broad, relatively open interpretation of systemic risks is preferable to preserve flexibility and adaptability, and appropriate in view of the DSA’s goals.
The discussion further explored the ‘systemic risk’ concept from two perspectives: its legal meaning and its societal embeddedness.
Legally, a key concern is the normative weight of VLOs’ obligation to identify, assess, and mitigate systemic risks. Several participants called for deeper conceptual engagement with the term. Platform-provided data can help inform our understanding, yet it’s equally necessary to have a robust conceptual framework – it’s not just about metrics, but meaning.
Systemic risk, according to whom?
Much of the debate centered on who shapes how systemic risks get interpreted.
At present, specific definitions may emerge from the interpretations of platforms, the EC and DSCs, or auditors – each with their own logics and incentives. Simply put, and within legal boundaries, a systemic risk can be whatever any of these stakeholders say it is. The risk of capture by a single stakeholder is real, and early reductive interpretations could create long-term path dependencies, undermining the DSA’s transformative ambition.
How did platforms define risks in their risk assessment reports published in November 2024? Reflections on the reports showed that platforms generally treated the four risk categories in Article 34 as a starting point, but applied diverse and inconsistent methodologies in their assessments – with little effort to define systemic risks as such.
It’s also unclear whether risk analysis should be performed in a cross-platform manner, taking into account the broader ecosystem of platforms’ services. While the VLOs did not undertake this particular exercise, the independent study commissioned by the Commission on the ‘Application of the risk management framework to Russian disinformation campaigns’ suggests that such an approach would be appropriate for proper risk assessment.
Legal clarity and future interpretations
Some participants emphasized the potential role of the Court of Justice of the European Union (CJEU) in providing legal clarity. The concept of systemic risks could evolve into an ‘autonomous concept of EU law’, with jurisprudence gradually shaping its meaning. Yet others pointed out the CJEU’s tendency toward vagueness – especially in interpreting fundamental rights, such as freedom of expression – which could limit its potential to provide the necessary clarity.
Another challenge before the court will be conceptualizing systemic risks to fundamental rights ex ante – looking at potential and collective harms, as the risk assessment framework requires – which could mean a departure from the orthodoxy of ex post, case-by-case evaluations of fundamental rights infringements.
An alternative path to clarity lies in the normative commitments embedded in the DSA itself and in the delegated acts adopted to date. As one of the participants pointed out, these texts suggest that the legislator intended for the concept of systemic risks to be defined through inclusive, participatory processes – engaging civil society and acknowledging the fact that technology and its harms are inseparable from their social contexts.
This perspective calls not only for broader engagement but also for context-sensitive methodologies capable of capturing how risks actually emerge and are experienced.
Context-aware methods for risk analysis
Workshop participants stressed the need for methodologies that account for how technologies operate within society. Systemic risks are not inherent to particular technologies, but emerge through their application — meaning that social, cultural, and political dynamics play a role in determining whether an aspect of a particular service is ‘risky’.
In this light, some participants proposed innovative qualitative methods to complement quantitative data. One approach put forward during the workshop involves projecting future scenarios to imagine how technologies might cause harm – helping surface risks that are difficult to foresee or quantify using existing data. This method grounds risk assessments in concrete social and technical contexts, encouraging more inclusive and forward-looking reflections on potential impacts.
All eyes on the Commission
Multiple speakers emphasized the central role of the Commission in ensuring VLOs’ accountability.
The Commission faces a steep learning curve as it steps into its new role as regulator in the DSA context, setting enforcement priorities and launching investigations. While participants voiced several critiques of its performance so far, many also acknowledged that – given the tight timelines, the need to build capacity almost from scratch, and the scale of the enforcement task – the Commission has largely avoided missteps in its early enforcement efforts.
Despite the limited transparency around its enforcement actions, some emerging enforcement priorities could be identified – particularly in relation to illegal content, the protection of minors, and election integrity. Rather than rushing to impose sanctions, the Commission appears, for now, more focused on shifting platform behaviour. This may reflect a cautious strategy: as some scholars noted, fines in the contentious area of systemic risks would likely face strong legal challenges from VLOs and could be difficult to substantiate in court with the data currently available.
At the same time, the Commission has shown openness to learn, frequently reaching out to academia and civil society for relevant input. While this engagement is valuable, some voices cautioned against researchers becoming an extension of the regulatory apparatus. The role of research should remain critical and independent – especially given the Commission’s inherently political character – and be driven by broader goals of knowledge production that support a wider range of stakeholders.
Ultimately, despite the Commission’s important role, its overall enforcement approach remains difficult to assess at this stage due to a lack of transparency.
DSA enforcement opacity: a bug or a feature?
Participants took this lack of transparency as a starting point to reflect on deeper structural issues. One recurring concern was that the Commission might be engaging in political bargaining behind closed doors or maintaining informal backchannels with platforms – arrangements whose outcomes are not made public. Without access to these processes, researchers and civil society actors are left in the dark, which undermines public accountability.
There is also limited clarity on what drives the Commission’s enforcement priorities. While it regularly updates its website with information about ongoing actions, these updates often take the form of short press releases rather than detailed, reasoned decisions.
At the same time, some participants suggested that this opacity may be a deliberate and, to some extent, understandable strategy. In such a politicized environment, revealing too much too soon could jeopardize carefully built enforcement cases – a pattern seen in other areas of law, such as competition enforcement.
Legitimacy through participation
Still, there was broad agreement that systemic risks under the DSA are not purely legal or technical matters. They are deeply embedded in social contexts and require regular input from a wide range of stakeholders – something that is expressly anticipated in the DSA’s text.
That’s why, even if full transparency isn’t always possible, it’s important for the Commission to find ways to communicate how external input is taken into account in its compliance efforts. Doing so is essential not only for upholding the fundamental rights and democratic values at stake, but also for reinforcing the DSA’s overall legitimacy.
A call to build collaborations
One of the central issues that reverberated throughout the workshop was the need to build inclusive collaborations across stakeholder groups.
Making the DSA’s systemic risk assessments meaningful will depend on assembling a wide base of knowledge. Platforms, for their part, should make a genuine effort to reach diverse communities in their public consultations – something currently only vaguely referenced in the risk assessment reports. The Commission, too, should aim to build broader research collaborations, reaching beyond the usual suspects in the Brussels bubble to better understand how the systemic risk framework can be operationalized.
This also raises the question about the role of academia in this broader ecosystem. Several participants stressed that academia must take a proactive role in facilitating civil society inclusion and ensuring that the voices of marginalized communities are heard.
This would also involve mapping out the wider DSA ecosystem: who’s involved, who’s left out, who drives the conversation, how information flows, and how the process could be rebalanced to reflect a broader, more representative range of insights.
Closing remarks
An important thread in the discussion was the tension in how the DSA systemic risk obligations should be interpreted – narrowly or broadly – and how these interpretative choices carry significant consequences for implementation. The way the rules are operationalized can lead to new problems, especially in a context where no single actor is clearly orchestrating the whole process.
For their part, VLOs – aside from occasional complaints about compliance costs – seem largely unfazed by the risk assessment obligations and generally appear to opt for paths of least resistance. The Commission, meanwhile, faces major challenges relating to capacity, experience, stakeholder relations management, enforcement efficacy, transparency, and a concentrated auditing market. Feedback loops to identify what is working – and what isn’t – are only just starting to develop.
To address these shortcomings, several participants highlighted the importance of meta-research – systematic investigation into where and why the system might be failing. Researchers and civil society have a vital role to play in this as an ‘epistemic counterpower’, capable of exposing gaps and holding institutions to account. Yet such processes are easily undermined by a lack of inclusivity, with some voices privileged while others are left out.
For researchers to fulfil their envisioned role, they need better access to information pipelines, greater transparency from the European Commission, functioning feedback mechanisms, and stronger guarantees of academic independence and safety – especially in today’s politically charged climate.
A key question raised in the closing session encapsulates the broader dilemma: How do we help develop the DSA’s risk assessment framework while remaining critical of it?
It may still be too early to say whether the systemic risk framework will become a meaningful tool for platform accountability. But we should be careful not to throw the DSA baby out with the bathwater.
At the same time, it is crucial to remain critically engaged, particularly as the DSA becomes part of a broader regulatory landscape shaped by deeper ideological and value-based tensions within democratic societies.