Shortcomings of the first DSA Audits — and how to do better

By Daniel Holznagel

At the end of 2024, the first audit reports under the Digital Services Act were published. Most were produced by Big Four accounting firms — and were, in many ways, not very ambitious. This post collects impressions from digesting most (not all) of these reports, focusing on five structural shortcomings that severely limit their usefulness: from illegitimate audit gaps to auditors’ apparent reluctance to interpret the law or meaningfully assess systemic risks (especially around recommender systems). The post also highlights a few useful disclosures — including platform-defined compliance benchmarks — and outlines where auditors, regulators, and civil society should push for improvements in future rounds.

Report: Pathways to Private Enforcement of the Digital Services Act

By Paddy Leerssen, Anna van Duin, Iris Toepoel, and Joris van Hoboken

Discussions of DSA enforcement tend to focus on regulatory action by the European Commission and national Digital Services Coordinators, but private actors are also taking the DSA to court. This report looks at the underexplored but important role of private enforcement—where individuals, NGOs, or consumer groups bring legal action themselves. It examines key DSA provisions with potential for such claims and outlines the legal and strategic choices that will shape how this tool is used in practice.

DSA Audits: How do platforms compare on influencer marketing disclosures?

By Taylor Annabell, Utrecht University

Under the DSA, social media platforms must provide clear tools for influencers to disclose paid content. But how well do they meet this obligation, and how rigorously is compliance assessed? This post compares eight DSA audit reports on influencer marketing disclosures under Article 26(2) and finds striking inconsistencies in how audits were conducted, what was measured, and how “compliance” was defined. The findings raise broader concerns about audit transparency, platform-defined standards, and the need for clearer guidance on what adequate disclosure—and meaningful oversight—should look like.

The DSA’s Systemic Risk Framework: Taking Stock and Looking Ahead

By Magdalena Jóźwiak, DSA Observatory

Drawing on a March 2025 workshop hosted by the DSA Observatory, this post shares reflections from researchers and civil society experts engaging with the DSA’s systemic risk framework—examining legal foundations, enforcement challenges, and the role of the research community in shaping its development.

Workshop Report: Researchers on Data Access and Preparing for DSA Article 40(4)

By John Albert and Paddy Leerssen, DSA Observatory

Drawing on a March 2025 workshop hosted by the DSA Observatory, this post shares practical and strategic insights from researchers preparing to make use of Article 40(4), from scoping proposals and navigating compliance to building collective support structures.

Making Recommender Systems Work for People: Turning the DSA’s Potential into Practice

By Alissa Cooper and Peter Chapman, Knight-Georgetown Institute

The Digital Services Act sets out broad new legal requirements to make recommender systems more transparent and accountable, including for their role in systemic risks. To fulfill that promise, implementation must go beyond basic disclosures and defaults; it must shape how these systems are designed and assessed over time. A new report from the Knight-Georgetown Institute and its accompanying EU Policy Brief offer a practical roadmap for putting these goals into action — and putting people first.

Expert insights: Fundamental rights in DSA dispute resolution procedures

By John Albert, DSA Observatory

Despite claims that it is “institutionalizing censorship,” the DSA is designed to protect fundamental rights, including freedom of expression. One key example is its provision allowing EU users to challenge platforms’ content moderation decisions through out-of-court dispute settlement (ODS) proceedings—a topic explored in depth at a recent workshop hosted by the DSA Observatory and the Article 21 Academic Advisory Board.

TikTok and the Romanian elections: A stress test for DSA enforcement

By John Albert, DSA Observatory

On December 17th, the Commission opened formal proceedings against TikTok to investigate whether the platform diligently managed electoral risks in the Romanian context. This case demonstrates two gears of regulatory action under the DSA: the slow, methodical pace of analyzing systemic risks and setting standards, and the fast, politically charged scramble to respond to high-profile crises.

DSA risk assessment reports: A guide to the first rollout and what’s next

By John Albert, DSA Observatory

Recently published risk assessment reports may offer new insights into how the largest online platforms think about and manage risks. But the first rollout has already raised questions about publication timelines, redactions, and formats. This post aims to clarify some of these ambiguities and points toward upcoming regulatory guidance on risk assessments, as well as opportunities for stakeholder engagement.

Researcher access to platform data: Experts weigh in on the Delegated Act

By John Albert, DSA Observatory

This post shares insights from a DSA Observatory workshop held on 18 November 2024, where researchers and legal experts met to discuss what’s new in the draft delegated act, what’s missing, and how to approach the Commission’s call for feedback.