Investigation: Platforms still use manipulative design despite DSA rules

By Chitra Mohanlal, Tech Researcher at Bits of Freedom

Our recent investigation into a selection of Very Large Online Platforms reveals multiple potential breaches of the DSA relating to manipulative design (Article 25), recommender system transparency (Article 27), and the obligation to offer alternative recommender systems (Article 38). The report explains and illustrates several types of manipulative design practices on platforms including Facebook, Snapchat, TikTok, Shein, Zalando and Booking.com. These findings can be used to support enforcement actions under the DSA.

What does the DSA mean for online advertising and adtech?

By Pieter Wolters & Frederik Zuiderveen Borgesius

What does the Digital Services Act (DSA) mean for online advertising and adtech (advertising technology)? This blogpost, based on a new research paper, explores that question. The most controversial insight is that ad networks and some other adtech companies must — based on an analysis of the DSA’s definitions — be considered ‘platforms’ in the sense of the DSA. Hence, they must comply with the DSA’s general rules for platforms.

The Commission’s approach to age assurance: Do the DSA Guidelines on protecting minors online strike the right balance?

By Sophie Stalla-Bourdillon (Brussels Privacy Hub, LSTS, VUB)

The European Commission’s guidelines on protecting minors online take important steps toward building a comprehensive list of service design and organizational measures relevant under Article 28(1) DSA. But in doing so, they also risk oversimplifying a complex regulatory trade-off that underlies the deployment of age assurance methods. This blog post argues that the guidelines overstate the proportionality of age verification and age estimation methods, sideline key data protection concerns, and miss the opportunity to articulate the implications of a rights-based, privacy-preserving design for all users.

Shortcomings of the first DSA Audits — and how to do better

By Daniel Holznagel

At the end of 2024, the first audit reports under the Digital Services Act were published. Most were produced by Big Four accounting firms — and were, in many ways, not very ambitious. This post collects impressions from digesting most (not all) of these reports, focusing on five structural shortcomings that severely limit their usefulness: from illegitimate audit gaps to auditors’ apparent reluctance to interpret the law or meaningfully assess systemic risks (especially around recommender systems). The post also highlights a few useful disclosures — including platform-defined compliance benchmarks — and outlines where auditors, regulators, and civil society should push for improvements in future rounds.

Report: Pathways to Private Enforcement of the Digital Services Act

By Paddy Leerssen, Anna van Duin, Iris Toepoel, and Joris van Hoboken

Discussions of DSA enforcement tend to focus on regulatory action by the European Commission and national Digital Services Coordinators, but private actors are also taking the DSA to court. This report looks at the underexplored but important role of private enforcement—where individuals, NGOs, or consumer groups bring legal action themselves. It examines key DSA provisions with potential for such claims and outlines the legal and strategic choices that will shape how this tool is used in practice.

DSA Audits: How do platforms compare on influencer marketing disclosures?

By Taylor Annabell, Utrecht University

Under the DSA, social media platforms must provide clear tools for influencers to disclose paid content. But how well do they meet this obligation, and how rigorously is compliance assessed? This post compares eight DSA audit reports on influencer marketing disclosures under Article 26(2) and finds striking inconsistencies in how audits were conducted, what was measured, and how “compliance” was defined. The findings raise broader concerns about audit transparency, platform-defined standards, and the need for clearer guidance on what adequate disclosure—and meaningful oversight—should look like.

The DSA’s Systemic Risk Framework: Taking Stock and Looking Ahead

By Magdalena Jóźwiak, DSA Observatory

Drawing on a March 2025 workshop hosted by the DSA Observatory, this post shares reflections from researchers and civil society experts engaging with the DSA’s systemic risk framework—examining legal foundations, enforcement challenges, and the role of the research community in shaping its development.

Workshop Report: Researchers on Data Access and Preparing for DSA Article 40(4)

By John Albert and Paddy Leerssen, DSA Observatory

Drawing on a March 2025 workshop hosted by the DSA Observatory, this post shares practical and strategic insights from researchers preparing to make use of Article 40(4), from scoping proposals and navigating compliance to building collective support structures.

Making Recommender Systems Work for People: Turning the DSA’s Potential into Practice

By Alissa Cooper and Peter Chapman, Knight-Georgetown Institute

The Digital Services Act sets out broad new legal requirements to make recommender systems more transparent and accountable, including for their role in systemic risks. To fulfill that promise, implementation must go beyond basic disclosures and defaults; it must shape how these systems are designed and assessed over time. A new report from the Knight-Georgetown Institute and its accompanying EU Policy Brief offer a practical roadmap for putting these goals into action — and putting people first.

Expert insights: Fundamental rights in DSA dispute resolution procedures

By John Albert, DSA Observatory

Despite claims that it is “institutionalizing censorship,” the DSA is designed to protect fundamental rights, including freedom of expression. One key example is its provision allowing EU users to challenge platforms’ content moderation decisions through out-of-court dispute settlement (ODS) proceedings—a topic explored in depth at a recent workshop hosted by the DSA Observatory and the Article 21 Academic Advisory Board.