Who speaks and who is heard? Civil society participation and participatory justice in DSA systemic risk management

By Mateus Correia de Carvalho (PhD researcher, EUI) and Rachel Griffin (postdoctoral researcher, Duisburg-Essen University)

This post presents the report ‘Who speaks and who is heard?’ on civil society participation and participatory justice in DSA systemic risk management, examining how DSA participation is unfolding in its early practice. Drawing on a qualitative empirical study with civil society actors of diverse backgrounds, perspectives, and activities, we trace the mechanisms those actors use, their perceptions of the usefulness of different spaces of participation, the strategic considerations determining when, where, whether, and how to participate, and the obstacles and inequalities hampering meaningful participation. We propose to understand these inequalities as a matter of participatory justice and make recommendations to improve the overall inclusiveness of DSA participation in systemic risk management.

Reclaiming the Algorithm: What the DSA can—and can’t—fix about recommender systems

By Katarzyna Szymielewicz, Panoptykon

Europe’s information environment has a structural vulnerability so long as dominant platforms continue to optimise their recommender systems for engagement rather than democratic resilience. This piece examines how the DSA can be used to push platforms toward algorithms that better serve the public interest—through systemic-risk mitigation, design obligations, and enforcement—and what meaningful recommender-system interventions could look like in practice.

The Missing Metrics in DSA Content Moderation Transparency

By Max Davy, Oxford Internet Institute

The Digital Services Act makes platform transparency reporting mandatory and standardised, but the metrics it requires still fall short of what is needed for real accountability. Counts of removals and appeals alone cannot tell us whether content moderation systems are accurate, proportionate, or effective, making the absence of evaluation metrics such as precision and recall increasingly difficult to justify under the DSA’s risk-based logic. While these metrics are unlikely to surface in baseline transparency reports under Articles 15 and 24, the post argues they may yet emerge through heightened scrutiny of the largest online platforms and search engines (VLOPSEs), as regulatory expectations take shape through enforcement, systemic risk reporting, audits, and related obligations.

What are DSA audits doing for systemic risk enforcement? The case of X

By John Albert, DSA Observatory

Of the nineteen service providers initially designated as VLOPSEs under the DSA, X’s first compliance audit stands apart. Its auditor, FTI Consulting, broke from industry peers by offering relatively critical opinions — including findings that were unfavourable to the platform on obligations under active Commission investigation. How did X respond? Rather than work toward implementing the auditor’s recommendations, X simply reshuffled the deck: it went out and hired a new auditor (BDO). The move raises a deeper question about what the DSA audit regime is actually doing — and how seriously the Commission treats audits as part of systemic-risk enforcement, which, in principle, relies on auditors to provide an additional, independent layer of scrutiny. 

Waiting for the DSA’s Big Enforcement Moment

By Magdalena Jóźwiak, DSA Observatory (University of Amsterdam)

This blog post explores the issue of DSA enforcement by the European Commission, focusing on the law’s systemic risk management provisions. It first briefly sketches the Commission’s role in regulatory oversight of the systemic risk framework and then sums up enforcement efforts to date, considering also […]

Applicable Law in Out-of-Court Dispute Settlement: Three Vertigos under Article 21 of the DSA

By Lorenzo Gradoni (University of Luxembourg) and Pietro Ortolani (Radboud University Nijmegen)

Article 21 of the DSA entrusts out-of-court dispute settlement bodies with reviewing platforms’ content moderation decisions. But which law should guide them? This post examines three options: terms of service, contract law, and human rights. Each option brings challenges and induces its own kind of Hitchcockian vertigo. A human-rights-based approach may strike a better balance, reconciling the efficiency of ODS bodies, fairness for users, and the readiness of platforms to cooperate.

Investigation: Platforms still use manipulative design despite DSA rules

By Chitra Mohanlal, Tech Researcher at Bits of Freedom

Our recent investigation into a selection of Very Large Online Platforms reveals multiple potential breaches of the DSA relating to manipulative design (Article 25), recommender system transparency (Article 27), and the obligation to offer alternative recommender systems (Article 38). The report explains and illustrates several types of manipulative design practices on platforms including Facebook, Snapchat, TikTok, Shein, Zalando and Booking.com. These findings can be used to support enforcement actions under the DSA.

What does the DSA mean for online advertising and adtech?

By Pieter Wolters & Frederik Zuiderveen Borgesius

What does the Digital Services Act (DSA) mean for online advertising and adtech (advertising technology)? This blog post, based on a new research paper, explores that question. The most controversial insight is that ad networks and some other adtech companies must, based on an analysis of the DSA’s definitions, be considered ‘platforms’ in the sense of the DSA. Hence, they must comply with the DSA’s general rules for platforms.

The Commission’s approach to age assurance: Do the DSA Guidelines on protecting minors online strike the right balance?

By Sophie Stalla-Bourdillon (Brussels Privacy Hub, LSTS, VUB)

The European Commission’s guidelines on protecting minors online take important steps toward building a comprehensive list of service design and organizational measures relevant under Article 28(1) DSA. But in doing so, they also risk oversimplifying a complex regulatory trade-off that underlies the deployment of age assurance methods. This blog post argues that the guidelines overstate the proportionality of age verification and age estimation methods, sideline key data protection concerns, and miss the opportunity to articulate the implications of a rights-based, privacy-preserving design for all users.

Shortcomings of the first DSA Audits — and how to do better

By Daniel Holznagel

At the end of 2024, the first audit reports under the Digital Services Act were published. Most were produced by Big Four accounting firms — and were, in many ways, not very ambitious. This post collects impressions from digesting most (not all) of these reports, focusing on five structural shortcomings that severely limit their usefulness: from illegitimate audit gaps to auditors’ apparent reluctance to interpret the law or meaningfully assess systemic risks (especially around recommender systems). The post also highlights a few useful disclosures — including platform-defined compliance benchmarks — and outlines where auditors, regulators, and civil society should push for improvements in future rounds.