Waiting for the DSA’s Big Enforcement Moment

By Magdalena Jóźwiak, DSA Observatory (University of Amsterdam)

This blog post explores the issue of DSA enforcement by the European Commission, focusing on the law’s systemic risk management provisions. It first briefly sketches the Commission’s role in regulatory oversight of the systemic risk framework and then sums up enforcement efforts to date, considering also […]

Applicable Law in Out-of-Court Dispute Settlement: Three Vertigos under Article 21 of the DSA

By Lorenzo Gradoni (University of Luxembourg) and Pietro Ortolani (Radboud University Nijmegen)

Article 21 of the DSA entrusts out-of-court dispute settlement bodies with reviewing platforms’ content moderation decisions. But which law should guide them? This post examines three options: terms of service, contract law, and human rights. Each option brings challenges and induces its own kind of Hitchcockian vertigo. A human-rights-based approach may strike a better balance, reconciling the efficiency of ODS bodies, fairness for users, and the readiness of platforms to cooperate.

Investigation: Platforms still use manipulative design despite DSA rules

By Chitra Mohanlal, Tech Researcher at Bits of Freedom

Our recent investigation into a selection of Very Large Online Platforms reveals multiple potential breaches of the DSA relating to manipulative design (Article 25), recommender system transparency (Article 27), and the obligation to offer alternative recommender systems (Article 38). The report explains and illustrates several types of manipulative design practices on platforms including Facebook, Snapchat, TikTok, Shein, Zalando and Booking.com. These findings can be used to support enforcement actions under the DSA.

What does the DSA mean for online advertising and adtech?

By Pieter Wolters & Frederik Zuiderveen Borgesius

What does the Digital Services Act (DSA) mean for online advertising and adtech (advertising technology)? This blog post, based on a new research paper, explores that question. The most controversial insight is that ad networks and some other adtech companies must — based on an analysis of the DSA’s definitions — be considered ‘platforms’ in the sense of the DSA. Hence, they must comply with the DSA’s general rules for platforms.

The Commission’s approach to age assurance: Do the DSA Guidelines on protecting minors online strike the right balance?

By Sophie Stalla-Bourdillon (Brussels Privacy Hub, LSTS, VUB)

The European Commission’s guidelines on protecting minors online take important steps toward building a comprehensive list of service design and organizational measures relevant under Article 28(1) DSA. But in doing so, they also risk oversimplifying a complex regulatory trade-off that underlies the deployment of age assurance methods. This blog post argues that the guidelines overstate the proportionality of age verification and age estimation methods, sideline key data protection concerns, and miss the opportunity to articulate the implications of a rights-based, privacy-preserving design for all users.

Shortcomings of the first DSA Audits — and how to do better

By Daniel Holznagel

At the end of 2024, the first audit reports under the Digital Services Act were published. Most were produced by Big Four accounting firms — and were, in many ways, not very ambitious. This post collects impressions from digesting most (not all) of these reports, focusing on five structural shortcomings that severely limit their usefulness: from illegitimate audit gaps to auditors’ apparent reluctance to interpret the law or meaningfully assess systemic risks (especially around recommender systems). The post also highlights a few useful disclosures — including platform-defined compliance benchmarks — and outlines where auditors, regulators, and civil society should push for improvements in future rounds.

Report: Pathways to Private Enforcement of the Digital Services Act

By Paddy Leerssen, Anna van Duin, Iris Toepoel, and Joris van Hoboken

Discussions of DSA enforcement tend to focus on regulatory action by the European Commission and national Digital Services Coordinators, but private actors are also taking the DSA to court. This report looks at the underexplored but important role of private enforcement, where individuals, NGOs, or consumer groups bring legal action themselves. It examines key DSA provisions with potential for such claims and outlines the legal and strategic choices that will shape how this tool is used in practice.

DSA Audits: How do platforms compare on influencer marketing disclosures?

By Taylor Annabell, Utrecht University

Under the DSA, social media platforms must provide clear tools for influencers to disclose paid content. But how well do they meet this obligation, and how rigorously is compliance assessed? This post compares eight DSA audit reports on influencer marketing disclosures under Article 26(2) and finds striking inconsistencies in how audits were conducted, what was measured, and how “compliance” was defined. The findings raise broader concerns about audit transparency, platform-defined standards, and the need for clearer guidance on what adequate disclosure—and meaningful oversight—should look like.

The DSA’s Systemic Risk Framework: Taking Stock and Looking Ahead

By Magdalena Jóźwiak, DSA Observatory

Drawing on a March 2025 workshop hosted by the DSA Observatory, this post shares reflections from researchers and civil society experts engaging with the DSA’s systemic risk framework—examining legal foundations, enforcement challenges, and the role of the research community in shaping its development.

Workshop Report: Researchers on Data Access and Preparing for DSA Article 40(4)

By John Albert and Paddy Leerssen, DSA Observatory

Drawing on a March 2025 workshop hosted by the DSA Observatory, this post shares practical and strategic insights from researchers preparing to make use of Article 40(4), from scoping proposals and navigating compliance to building collective support structures.