Practical Considerations for Out-of-Court Dispute Settlement (ODS) under Article 21 of the EU Digital Services Act (DSA)

Thomas Hughes

Thomas Hughes is the Director of the Oversight Board Administration. This article is written in a personal capacity and does not necessarily reflect the views of the Oversight Board.

 

Out-of-court dispute settlement bodies (ODS) have the potential to be an important pillar of the EU Digital Services Act (DSA). As proposed in this piece, consistency, transparency and appropriate standards regarding “engage[ment], in good faith”, managing abuse, expertise, accessibility, data protection and reporting would help ensure ODS are an effective route to redress for users, as well as a valuable source of data for improving content governance in the EU and beyond.

I. Introduction

The Digital Services Act (DSA) marks a turning point in how online platforms treat users. Article 21 of the Act provides for the creation of out-of-court dispute settlement bodies (ODS). These bodies will be certified by Digital Services Coordinators (DSCs) to resolve disputes relating to the content enforcement decisions of online platforms, providing effective, timely and independent redress.

The successful realisation of this new Article 21 landscape will bring meaningful value to people and organisations, empowering them to challenge decisions taken by companies about content on their platforms, thereby pursuing their rights and influencing their immediate and wider information environments. But the impact of these bodies also has the potential to exceed the resolution of individual disputes. By requiring ODS bodies to report on their activities, Article 21 will also create a source of data that will help identify systemic risks and harms. This regulatory feedback loop could strengthen the overall framework of the DSA, allowing regulators to better target their interventions and platforms to put in place mitigation measures, with particular consideration to the impacts of such measures on human rights.

At the same time, the details of Article 21’s implementation are yet to be fully defined. What is required for ODS bodies to be certified, and the nature of platforms’ and users’ (including people and organisations) obligations to “engage, in good faith” with the ODS (Article 21(2)), are established at a high level in the DSA. DSCs have their work cut out in determining how to apply these requirements, whilst ODS bodies should ensure they are working in a coherent, consistent and transparent manner in order for Article 21 to be a success.

Approaching Article 21 without clear standards in mind would be a missed opportunity to better understand how platforms operate, and to ultimately improve them. It could also lead to unwanted results. Differing approaches to assessing ODS expertise (Article 21(3)(b)) and cost-effectiveness and efficiency (Article 21(3)(e)), and differing standards of “engage[ment], in good faith”, risk creating inconsistency for users and unpredictability for platforms and regulators. Article 21 bodies operating based on materially different standards will also report out data that is hard to analyse or compare. The value of Article 21, both as a route to redress for users and as a mechanism for feeding back into the DSA’s wider regulatory aims, could be undermined. If the terms and conditions of platforms are interpreted in significantly different ways, it will increase the risk of forum shopping for desired outcomes, resulting in a chaotic ecosystem of bodies that fail to bring much needed clarity to an already complex space.

With that in mind, for Article 21 to be a success, a realistic and practical approach will be required based on the standardized interpretation and application of several Article 21 requirements.

II. Requirement to engage in good faith

For any ODS to settle disputes in a swift, efficient, and cost-effective manner, the ODS will require “engage[ment], in good faith” from the platform(s) within its scope. This engagement will need to consist, at a minimum, of the following elements (illustrated in the sketch that follows this list):

  • The capacity to receive notice of a dispute from an ODS and identify the content enforcement decision that is the subject of the dispute;
  • The timely provision of certain essential information to the ODS to allow the dispute to be determined as eligible for review and resolved within the timeframes specified in Article 21, in particular the content subject to the dispute and the platform’s prior reasons for making the content enforcement decision that is the subject of the dispute; and
  • The capacity to receive the non-binding decision of the ODS and to determine whether to implement it. Although there is no requirement for platforms to implement decisions, where the standards set out here are met, it should be in platforms’ interests to comply.
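
By way of illustration, the following minimal sketch models these three touchpoints between a platform and an ODS as simple data records. It is written in Python purely for concreteness; the message types and field names (DisputeNotice, PlatformResponse, OdsDecision and their fields) are assumptions made for the example, not formats prescribed by the DSA.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

# Hypothetical message types; the DSA does not prescribe any particular format.

@dataclass
class DisputeNotice:             # sent by the ODS to the platform
    dispute_id: str              # ODS case reference
    decision_reference: str      # identifies the contested enforcement decision
    user_statement: str          # the user's reasons for disputing the decision

@dataclass
class PlatformResponse:          # returned by the platform "in good faith"
    decision_reference: str
    content: Optional[str]       # the disputed content, if it was removed and is still retained
    statement_of_reasons: str    # the Article 17 statement of reasons
    prior_appeal_resolved: bool  # whether an Article 20 internal appeal was completed

@dataclass
class OdsDecision:               # non-binding outcome returned to the platform
    dispute_id: str
    decision_reference: str
    outcome: str                 # e.g. "uphold" or "reverse"
    issued_at: datetime
```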

In relation to the content that is the subject of the appeal, not all users will be able to provide this content, because the enforcement action under appeal may have included removing the content in question. As such, platforms (rather than users) should, as part of “engag[ing], in good faith” with ODS, provide the content subject to appeal to the ODS reviewing the dispute. While Article 21 does not specify a time period for keeping such content, by reference to Article 20(1), at least six (6) months following the enforcement action appears to be a suitable period. Moreover, platforms should provide the relevant content to ODS without delay once an eligible appeal is submitted.

In relation to the platform’s prior reasons for making the enforcement decision that is the subject of the appeal, Article 17(1) of the DSA requires platforms to provide users with a “clear and specific statement of reasons” for any restriction imposed on their content or account. As part of their obligation to “engage, in good faith” with ODS, platforms should provide this statement of reasons to ODS in connection with a user’s appeal. This will allow the ODS to determine whether the relevant appeal is within its scope and ensure that it is applying the appropriate norm(s) in issuing a decision in the case. The information would ideally include (i) whether the piece of content was removed because it was considered illegal or incompatible with the platform’s terms and conditions; and (ii) where it was found incompatible with the terms and conditions, the relevant policy and the specific policy area the content was found to violate.

In terms of how information and decisions are tracked as between the platform, user, and ODS, it is worth noting that the DSA Transparency Database uses a Platform Unique Identifier (PUID) mechanism. Each statement of reasons is assigned a PUID, which allows the enforcement decision to be matched to an entry on the Database. As part of their obligation to “engage, in good faith” with ODS, platforms should utilize a unique reference number such as the PUID to allow disputed enforcement actions to be tracked from the platform to the ODS and back. Such a mechanism will also allow for the operation of the provision in Article 21(2) whereby a platform can decline to engage “if a dispute has already been resolved concerning the same information and the same grounds of alleged illegality or incompatibility of content”.
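
By way of illustration, the sketch below shows how an ODS might use such a shared reference to operationalise the Article 21(2) ground for declining repeat disputes. It is a minimal, assumption-laden example: the premise that a PUID alone suffices to identify “the same information and the same grounds”, and the function names used, are illustrative rather than drawn from the DSA or the Transparency Database specification.

```python
# Minimal sketch of PUID-based tracking; the data layout is an assumption,
# not the actual Transparency Database schema.

resolved_disputes: set[str] = set()   # PUIDs of decisions already settled by this ODS

def can_accept_dispute(puid: str) -> bool:
    """Decline a new dispute where one concerning the same decision has already
    been resolved (cf. Article 21(2) DSA)."""
    return puid not in resolved_disputes

def record_resolution(puid: str) -> None:
    """Remember the PUID once the ODS has issued its decision on the dispute."""
    resolved_disputes.add(puid)
```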

Users of very large online platforms (VLOPs) are likely to drive the bulk of disputes to ODS simply by virtue of the relative size of those platforms’ EU user bases, while smaller platforms will have fewer resources to expend on automated solutions of this type. Nevertheless, given the anticipated number of disputes likely to come before ODS, scale will be a critical factor for all platforms to consider in relation to Article 21.

III. Managing abuse

Given the cost structure set out by Article 21 (where platforms bear the cost of dispute resolution conducted by ODS, even where the user ‘loses’ their dispute, although not if the user manifestly acted in bad faith), the system is potentially open to abuse. At the significant scale suggested by the European Commission’s DSA Transparency Database, abusive appeals – whether repetitive, frivolous, or attempts to game the system – could drive major cost and operational burdens for ODS and platforms alike and undermine the integrity of the system envisaged by Article 21. There are several ways in which this potential for abuse could be mitigated.

First, ODS could reduce the risk of abuse by charging users a nominal fee for submission of an appeal, as Article 21(5) allows. A suitable fee could deter users from submitting frivolous or bad-faith disputes, rendering case volumes more manageable for ODS. At the same time, where the user is successful in their appeal, Article 21(5) ensures the platform will be required to reimburse the nominal fee. Given this reimbursement obligation and the need for user fees to be “nominal”, ODS should also charge reasonable participation fees to platforms under the mechanism envisaged by Article 21(5) (and should align with in-scope platform(s) as to how such fees should be settled). When setting their fee schedules, ODS will have to balance questions of reasonableness, affordability, fairness, and independence, while considering the incentives these factors create.
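
By way of illustration, the sketch below sets out one possible allocation of fees per dispute, reflecting the reimbursement logic described above (and the bad-faith exception discussed later in this section). The fee amounts, names and categories are hypothetical placeholders, not figures suggested by the DSA.

```python
# Illustrative allocation of fees under Article 21(5); the amounts and the
# assumption of a flat per-case platform fee are placeholders for the sketch.

NOMINAL_USER_FEE = 5.00     # hypothetical nominal fee charged to the user
PLATFORM_CASE_FEE = 95.00   # hypothetical reasonable participation fee charged to the platform

def settle_fees(user_prevailed: bool, manifest_bad_faith: bool) -> dict[str, float]:
    """Return the amount each party ultimately bears for one dispute."""
    if manifest_bad_faith:
        # The user may be required to reimburse the platform's fees and expenses.
        return {"user": NOMINAL_USER_FEE + PLATFORM_CASE_FEE, "platform": 0.0}
    if user_prevailed:
        # The platform reimburses the user's nominal fee and bears its own fee.
        return {"user": 0.0, "platform": PLATFORM_CASE_FEE + NOMINAL_USER_FEE}
    # The user 'loses' but acted in good faith: the platform still bears its own fee.
    return {"user": NOMINAL_USER_FEE, "platform": PLATFORM_CASE_FEE}
```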

Second, ODS should require users to submit a statement explaining why they have chosen to dispute the platform’s enforcement action. Some users may choose to submit a detailed explanation with reference to appropriate norms; others may give a short account of why they believe the platform’s enforcement decision was unjustified. In any case, requiring a user statement will encourage users to set out their good-faith reasoning for the dispute.

Third, users should first make use of platforms’ internal complaint-handling mechanisms under Article 20 before bringing their dispute to an ODS, and ODS should implement this condition as part of their “clear and fair rules of procedure” (Article 21(3)(f)). This reflects the practice of existing alternative dispute resolution bodies in the EU, and has numerous advantages as a standard for ODS, including:

  • The user first taking the opportunity to swiftly obtain redress from the platform directly;
  • The platform having an opportunity to resolve the dispute before it goes to an ODS (thereby helping ODS manage overwhelming volume);
  • The ODS having the benefit of knowing that the platform has not been able to resolve the appeal internally before reviewing the case (and thereby understanding that the case meets a certain threshold of legitimacy and seriousness); and
  • The ODS’ reported data consistently reflecting cases that have already been through all levels of internal review at a platform (and thereby having greater utility to regulators, researchers, and others referring to that data).

Fourth, ODS should consider deploying technological solutions to weed out spam, as well as triaging and stacking solutions to bundle appeals of the same content by different users.
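
By way of illustration, the sketch below shows a simple bundling step that groups incoming appeals by the identifier of the disputed decision (for example, a PUID or other content reference), so that duplicate appeals concerning the same content can be triaged and reviewed together. The record structure and field names are assumptions made for the example.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Appeal:
    appeal_id: str
    decision_reference: str   # e.g. a PUID or other identifier of the disputed decision
    user_id: str

def bundle_appeals(appeals: list[Appeal]) -> dict[str, list[Appeal]]:
    """Group appeals that target the same enforcement decision so they can be
    reviewed as a bundle rather than as separate cases."""
    bundles: dict[str, list[Appeal]] = defaultdict(list)
    for appeal in appeals:
        bundles[appeal.decision_reference].append(appeal)
    return bundles
```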

Lastly, although Article 21 and ODS are not mentioned in the measures and protections against misuse in Article 23, the provision under Article 21(5) for users reimbursing the platform’s ODS fee or other expenses if they have manifestly acted in bad faith provides another avenue to deter abuse, although this should be interpreted conservatively and used exceptionally.

Together, these protections can ensure that disputes submitted to ODS reach a reasonable level of materiality, reducing abuse and preventing bad-faith actors from taking advantage of the mechanism envisaged by Article 21.

IV. Expertise

ODS will need to ensure that they have expertise “in one or more particular areas of illegal content, or in relation to the application and enforcement of terms and conditions of one or more types of online platform” (Article 21(3)(b)). Given variance between EU Member State laws, it may be that national actors (whether private or public) are best placed to demonstrate expertise in those laws and to apply them to disputes as ODS in their own Member State of establishment. However, based on data in the European Commission’s DSA Transparency Database, the majority of platform enforcement actions are carried out based on platforms’ content policies, and it seems likely that the majority of disputes users wish to bring before ODS will turn on the application of those policies. Whether a dispute concerns allegedly illegal content or content that violates platform policies, ODS will also need access to the requisite linguistic and contextual expertise.

Unlike laws, platform content policies are subject to relatively frequent changes at platforms’ discretion, and each platform’s content policies are unique to its specific services. ODS will need to demonstrate how they plan to engage with platforms, understand their terms and conditions and the way that platforms apply and enforce them, and maintain appropriate arm’s-length training to stay abreast of policy changes. Platforms should enable such an exchange as part of their obligation to “engage, in good faith”, always bearing in mind Article 21’s requirement of independence. Commercial outsourced content moderation firms have created precedents for external application of platforms’ terms and conditions, including at scale, and ODS will need to draw lessons from these precedents while avoiding the drawbacks for which such arrangements have been criticised, including high error rates, unreasonable operational targets, and financially motivated decision-making.

V. Accessibility

Article 21 obliges ODS to offer dispute settlement that is “easily accessible, through electronic communications technology and provides for the possibility to initiate the dispute settlement and to submit the requisite supporting documents online” (Article 21(3)(d)). Moreover, accessibility for users will require platforms to provide clear and user-friendly information about ODS and about users’ options for raising a dispute, as required under Article 17(3)(f). As such, platforms will need to ensure that users can easily find and engage ODS.

VI. Data Protection

Given the scale at which ODS may operate, it is likely that they will be controllers of significant volumes of user data, received both directly from users and from platforms. The subject matter of content enforcement (including hate speech, nudity, and harassment) suggests that some of this data will be sensitive. Designing and deploying interoperable systems that ensure a high standard of protection for users’ and platforms’ data, tracking the requirements of the General Data Protection Regulation (GDPR) and applicable industry standards, will be complex. ODS serving users of multiple platforms will also need to ensure robust standards of data protection and segregation. Even off-the-shelf tools will need to be licensed and adapted to suit ODS purposes. As such, ODS, platforms, and DSCs should set expectations about data protection, security, and technical standards for such tools.

VII. Reporting

One of the key benefits of ODS to the wider DSA ecosystem will be the data they report out. Data on the types of disputes submitted to ODS by users, and how those disputes are resolved, will allow regulators, platforms, researchers, civil society organizations and other actors to identify and target critical content governance issues for intervention. However, this will only be possible if ODS (in cooperation with DSCs) can establish a common approach to the definitions, form and accessibility of the data they report, along with the minimum data fields to be reported.
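
By way of illustration, the sketch below sets out what a minimum common reporting record might contain. The field names and categories are assumptions intended only to show the kind of standardisation that would make reported data comparable across ODS; they are not requirements drawn from the DSA.

```python
from dataclasses import dataclass
from datetime import date

# Illustrative minimum reporting record for an ODS; the fields are assumptions
# about what a common standard might cover, not requirements taken from the DSA.

@dataclass
class OdsCaseReport:
    ods_body: str            # name of the certified ODS body
    platform: str            # platform whose decision was disputed
    decision_basis: str      # "illegal_content" or "terms_and_conditions"
    policy_area: str         # e.g. "hate_speech", "nudity", "harassment"
    outcome: str             # e.g. "upheld", "reversed", "settled", "inadmissible"
    days_to_resolution: int  # time from eligible submission to decision
    received: date           # date the dispute was received
```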

Establishing such a standard will allow regulators and other stakeholders to aggregate data reported out across different ODS, reducing fragmentation and giving as broad a picture as possible of ODS activities across different regions, platforms, and areas of expertise. Additionally, this may underpin appropriate engagement between ODS, regulators and other actors in the content policy ecosystem, such as the Oversight Board.