Expert insights: Fundamental rights in DSA dispute resolution procedures

By John Albert, DSA Observatory


On 2 December 2024, the DSA Observatory and the Article 21 Academic Advisory Board hosted an online workshop to explore the intersection of fundamental rights and dispute resolution procedures under Article 21 of the DSA. This blogpost shares key themes from the discussion. 


Mark Zuckerberg sparked heated debate last week when he announced major changes to Meta’s content moderation policies and characterized the EU’s Digital Services Act (DSA) as “institutionalizing censorship.”

Such claims are wrong, as EU Commissioner Henna Virkkunen has pointed out. Rather than imposing censorship, the DSA aims to create accountability structures for platforms while empowering users with new rights to protect, among other things, freedom of expression.

One notable but often overlooked example is the DSA’s provision for EU users to challenge platforms’ content moderation decisions through out-of-court dispute settlement (ODS) proceedings. This topic was at the center of a recent workshop hosted by the DSA Observatory and the Article 21 Academic Advisory Board.

With the recent certification of the first five ODS bodies, the workshop provided a timely forum for experts and representatives of ODS bodies to discuss the complexities of these new procedures—particularly if, when, and how ODS bodies should integrate fundamental rights into their decision-making processes under Article 21 of the DSA. 

Framing the discussion

Article 21 empowers users to appeal platform decisions on content moderation—such as content removal, account suspension, or access restrictions—to certified ODS bodies. 

However, while Article 21 is silent on fundamental rights, Article 14(4) requires platforms to enforce their terms and conditions in a diligent, objective, and proportionate manner, with due regard for these rights. This creates ambiguity about the role ODS bodies should play in fundamental rights reviews.

Two main approaches to fundamental rights review were discussed:

  • Option 1: Minimal review – ODS bodies could limit themselves to procedural oversight, avoiding the complexity and legal uncertainty of handling fundamental rights assessments. This approach would focus on whether platforms are following their general obligations, rather than getting into the nitty-gritty of specific cases.
  • Option 2: Active application – ODS bodies could adopt fundamental rights as a normative framework, using Article 14(4) and European fundamental rights law as guiding principles. This approach could clarify what “due regard” for rights means and ensure platforms apply their terms while protecting users’ rights.

If ODS bodies decide to review for fundamental rights issues, there is a further operational question of which cases should be reviewed. Three procedural options were suggested:

  1. Review all cases – Assess fundamental rights in every case, because Article 14 applies to all content moderation decisions.
  2. Selective review – Limit reviews to cases where platform actions, such as content removal or demotion, significantly impact user rights.
  3. Case-by-case assessment – Base reviews on factors such as impact on rights and case complexity, recognizing the need to consider both the person reporting content and the person creating content.

With this framing in mind, the workshop moved to an open discussion of broader pressing questions related to the roles and obligations of platforms and ODS bodies.

Key Themes from the Workshop

1. A check on platforms

Participants noted that platforms often regulate speech more restrictively than states, raising questions about their obligations to users’ rights. While Article 14(4) mandates “due regard” for fundamental rights, platforms’ terms of service can be misaligned with these obligations. ODS bodies could provide an external check on whether platforms’ enforcement of terms respects fundamental rights, including freedom of expression, privacy, and non-discrimination.

2. Practical considerations

Participants addressed the practical constraints facing ODS bodies when applying fundamental rights. It was noted that these bodies are not courts, but rather somewhere between platform moderation and a court, tasked with resolving disputes relatively quickly and efficiently. Reviewing every case for fundamental rights violations may not be feasible. The discussion leaned toward prioritizing cases with significant impacts on user rights or clear rights violations while building processes that make fundamental rights assessments more cost-effective at scale.

3. Legitimacy and consistency

Users often view platform decisions as opaque or biased. By incorporating fundamental rights principles, ODS bodies could bolster their legitimacy. However, concerns were raised about symbolic use versus substantive application. And without uniform standards, there’s a risk that each ODS body interprets rights differently, leading to inconsistent outcomes and undermining confidence in the system.

4. Divergences among ODS bodies

The workshop highlighted the potential for ODS bodies to adopt varying practices due to competitive or strategic incentives. This differentiation might arise as bodies seek to attract cases or establish unique value propositions. While this flexibility could foster innovation, it risks creating fragmented interpretations of fundamental rights, complicating efforts to achieve consistency across the system. It was also noted that there is currently no formal mechanism to ensure coherence among bodies over time.  

5. Data sharing and transparency

Participants emphasized that for ODS bodies to conduct fair fundamental rights assessments, platforms need to provide them with access to the content under dispute and the context for their moderation actions. Currently, many platforms fail to do this. Without transparency and proper legal and technical arrangements for sharing data, ODS bodies risk making decisions without a full understanding of the case. It was further noted that platforms should implement adequate transparency measures for users to ensure they are informed of their rights to an ODS process.

6. A feedback loop for platforms

Participants highlighted the potential for ODS bodies to contribute to systemic change. By analyzing patterns in the cases they handle, these bodies could provide platforms with insights to refine their content moderation policies and risk assessments. This feedback loop could help platforms align more closely with fundamental rights while improving their decision-making processes.  

7. Independence and collaboration

Participants emphasized that ODS bodies need to maintain impartiality to be credible, but they also require cooperation from platforms—whether it’s sharing data or implementing recommendations. This dual role points to the need for ODS bodies to balance independence with constructive engagement.

Looking Ahead

This workshop explored the complexities of integrating fundamental rights into ODS proceedings. While there was broad agreement on their importance, participants acknowledged the challenges of operationalizing these principles. For example, the “due regard” requirement remains vague, and ODS bodies face limitations in applying it consistently across diverse cases.

Despite these challenges, ODS bodies hold significant potential to bridge the gap between platforms and users, offering a pathway to more rights-respecting content moderation. By providing both individualized remedies and systemic feedback, they contribute to the DSA’s evolving accountability framework. Building a community of practice and continuing to engage in discussions around these efforts will be essential for their success.