Redressing Infringements of Individuals’ Rights Under the Digital Services Act
Bengi Zeybek, Joris van Hoboken and Ilaria Buri
(Institute for Information Law, IViR – University of Amsterdam)
Introduction
Platforms’ content moderation decisions affect individual users and society at large in various ways. Content may be unduly removed or accounts may be suspended arbitrarily; politicians and influencers may be deplatformed, or artists may be shadowbanned. As a result, the rights of individual users can be infringed, and some groups’ voices can be disproportionately undermined.
If individuals believe their rights have been violated online, what redress possibilities exist? The currently applicable intermediary liability law in the EU, the E-Commerce Directive ("ECD"), provides for safe harbours that shield intermediaries from secondary liability. Within the ECD's conditional immunity shield, platforms have developed, at their own discretion, a wide range of remedies to redress wrongful actions against online content and activity.
A specific private right of action for individual infringements is absent from the ECD and from other content- or sector-specific rules for digital service providers, such as the Regulation on preventing the dissemination of terrorist content online. In cases of individual infringements, users can bring tort claims under national laws, although the success of such claims has remained limited. Courts in some Member States, such as the Netherlands (see, for example, here and here) and Germany (see here and here), have already dealt with cases concerning undue removals and platforms' terms and conditions on the basis of tort law, in conjunction with fundamental rights protection.
What will be the implications of the Digital Services Act ("DSA") for the redress opportunities available to individuals online? The DSA will likely offer several significant improvements to ensure that appropriate remedies exist to safeguard individuals' rights. Remedial improvements under the DSA can be grouped into two categories: (i) mandatory procedural mechanisms that all online platforms must implement in their content moderation processes (Articles 14 – 20) and (ii) the right to lodge administrative complaints against intermediaries with the competent Digital Services Coordinator ("DSC") for infringements of the DSA (Article 43).
The DSA does not explicitly create a separate private right of action for the protection of individuals' rights; nor does it provide for actionable substantive rights for individuals. It introduces due diligence obligations for platforms that are procedural and systemic in nature, the scope of which differs based on the number of users of the platform (more extensive obligations apply to very large online platforms, "VLOPs"). These due diligence obligations are in principle subject to administrative oversight shared between the Digital Services Coordinators and the Commission (see here for an analysis of the regulatory powers in the DSA). The DSA is to apply without prejudice to any judicial remedies available under national laws.
This blogpost looks into the provisions of the DSA related to redress opportunities for infringements of individuals' rights in two parts. In the first part, it gives an overview of the due process and dispute resolution mechanisms that platforms are required to integrate into their content moderation processes. In the second part, it takes a closer look at Article 43, which creates a right for individuals to lodge complaints. This provision is considered in combination with other provisions of the DSA, in particular Article 12, which requires platforms to apply and enforce their terms and conditions in a "diligent, objective and proportionate manner" with "due regard" to the "fundamental rights" of users (see here for a detailed analysis of Article 12). In doing so, it explores the scope of administrative complaints under Article 43 and the implications of this provision, in combination with Article 12, for a private right of action in individual cases of infringement.
Due Process and Dispute Resolution Mechanisms Under the DSA
The DSA requires platforms to implement a set of procedural mechanisms. In this regard, the DSA will be the most procedurally detailed intermediary liability law after the US Digital Millennium Copyright Act (DMCA). Building on the existing procedural operations of major platforms, the DSA's procedural requirements are set to become core to platforms' business practices in the EU. Below we lay out the due process mechanisms required by the DSA in detail.
To begin with, Article 14 of the DSA provides for a harmonized notice and action mechanism for notifying illegal content. Notifications of illegal content must fulfil minimum information requirements in order to be actionable. These mechanisms must be ‘easy to access, user-friendly, and allow for the submission of notices exclusively by electronic means.’ Requirements to improve user-facing design are part of other procedural provisions as well (Articles 15-18).
Upon receiving a notification, the platform must confirm its receipt. The individual or entity that submitted the notice must be informed of the platform's decision on the notified content and of the redress possibilities in respect of that decision (Article 14(5)). Where notices are processed and decided upon by fully automated means, platforms must also inform the notifier of that automation (Article 14(6)).
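To make the mechanics of Article 14 more concrete, the sketch below models an actionable notice as a simple data record. This is purely illustrative: the field names, the `IllegalContentNotice` class and the `is_actionable` check are our own assumptions; the DSA prescribes the substance of the information a notice must contain, not any particular format or implementation.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class IllegalContentNotice:
    """Illustrative model of a notice under Article 14(2); all field names are hypothetical."""
    explanation: str                       # why the notifier considers the content illegal
    content_locations: List[str]           # exact electronic location(s), e.g. the URL(s)
    good_faith_confirmation: bool          # statement that the notice is accurate and complete
    notifier_name: Optional[str] = None    # name of the notifying individual or entity
    notifier_email: Optional[str] = None   # electronic contact address of the notifier

    def is_actionable(self) -> bool:
        # Rough proxy for the minimum information requirements: a notice
        # lacking these elements would not be sufficiently precise and
        # substantiated to be acted upon.
        return bool(self.explanation and self.content_locations and self.good_faith_confirmation)
```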
In addition to submitting notifications, where allegedly illegal content is at issue, users can also apply to courts or administrative authorities for injunctions for the termination or prevention of an infringement under Article 5(4) of the DSA. If users can prove that the platform in question had knowledge or awareness of the illegal content or activity, for instance on the basis of a notice submitted under Article 14, they may also claim damages.
Further, the DSA foresees mandatory appeal mechanisms for users to challenge content moderation decisions concerning both illegal content and content or activity that violates the platform's terms of service. These procedural obligations build on transparency reporting and information requirements (fundamental rights in terms and conditions – Article 12; transparency reporting for all intermediary service providers – Article 13, for online platforms – Article 23 and for VLOPs – Article 33). Where a platform decides to remove or disable access to specific content, at its own initiative or upon a complaint by another user or a third party (e.g., law enforcement or trusted flaggers), it is required to provide the affected user with a ‘statement of reasons’ for that decision (Article 15). Such a statement of reasons shall include the redress possibilities in respect of the decision, in particular internal complaint-handling mechanisms, out-of-court dispute settlement and judicial redress possibilities available under the laws of the Member States (Article 15(2)(f)). It is important to note that the Council mandate on the DSA, and similarly the EP mandate, propose to extend these procedural rights in Article 15 also to decisions concerning the restriction of visibility or monetization of specific items of information.
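Along the same lines, and again purely as an illustrative sketch with field names of our own invention, the required contents of a statement of reasons could be modelled as follows; the DSA specifies the information to be provided, not its representation.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class StatementOfReasons:
    """Illustrative model of an Article 15(2) statement of reasons; fields are hypothetical."""
    facts_and_circumstances: str        # the facts and circumstances relied on for the decision
    automated_means_used: bool          # whether automated means were used in taking the decision
    redress_possibilities: str          # internal complaint-handling, out-of-court settlement,
                                        # judicial redress (Article 15(2)(f))
    legal_ground: Optional[str] = None        # reference to the legal ground, where the content
                                              # is considered illegal
    contractual_ground: Optional[str] = None  # reference to the terms-and-conditions clause relied on
```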
In addition to the obligations to provide notice and action mechanisms and statements of reasons, which apply to providers of hosting services (Section 2 of Chapter III), online platforms (Section 3 of Chapter III) are obliged to provide internal complaint-handling systems (Article 17) and out-of-court dispute settlement mechanisms (Article 18). Article 17 DSA requires online platforms to introduce internal complaint-handling systems for their decisions to remove or disable access to information that is illegal or incompatible with the platform's terms of service, and to suspend or terminate user accounts and/or the provision of the service. Complaints subject to the internal complaint-handling system are to be handled in a “timely, diligent and objective manner” and cannot be resolved by fully automated means (Article 17(5)). If a complaint handled under Article 17 sufficiently demonstrates that the content in question is neither illegal nor contrary to the platform's terms and conditions, the platform shall reverse its decision (Article 17(3)). For complaints not resolved under the internal complaint-handling system, platforms must make out-of-court dispute settlement available under Article 18. These dispute resolution mechanisms will also become available for decisions to restrict the visibility or monetization of specific items of information if the Council and EP amendments to Article 15 find their way into the final text of the DSA.
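A minimal sketch of the decision logic of Article 17(3) and (5) might look as follows. The `handle_complaint` function, the `human_reviewer` object and its `assess` method are hypothetical stand-ins; the point is only that the final decision cannot be taken by fully automated means and must be reversed where the complaint sufficiently demonstrates that the original decision was unwarranted.

```python
from enum import Enum

class ComplaintOutcome(Enum):
    DECISION_REVERSED = "decision reversed"
    DECISION_UPHELD = "decision upheld"

def handle_complaint(complaint, human_reviewer) -> ComplaintOutcome:
    """Illustrative Article 17 flow: complaints must be handled in a timely,
    diligent and objective manner, and cannot be resolved by fully automated
    means (Article 17(5)); hence the mandatory human reviewer."""
    assessment = human_reviewer.assess(complaint)  # hypothetical non-automated review step
    # Article 17(3): where the complaint sufficiently demonstrates that the
    # content is neither illegal nor incompatible with the platform's terms
    # and conditions, the platform shall reverse its decision.
    if not assessment.content_is_illegal and not assessment.violates_terms:
        return ComplaintOutcome.DECISION_REVERSED
    return ComplaintOutcome.DECISION_UPHELD
```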
The above-mentioned procedures, and the use of platforms' services in general, are premised on their use in good faith. Where that is not the case, the DSA lays out the consequences: the use of platforms' services and of these due process mechanisms in bad faith shall be restricted under Article 20 (“Measures and protection against misuse”). Taking such measures is mandatory under the EC Proposal and the Council mandate (“shall”), but under the EP mandate platforms would have discretion to do so (“shall be entitled”). Where a user frequently provides manifestly illegal content (Article 20(1) – the EP mandate requires only “illegal content”) or frequently submits manifestly unfounded notices or complaints under Articles 14 and 17 (Article 20(2)), online platforms shall suspend the provision of their services to that user. The exact conditions amounting to behaviour that constitutes misuse are to be determined by the policies of the platforms (Article 20(4)), with the possibility to take stricter measures in the case of manifestly illegal content related to serious crimes (Recital 47). The EC Proposal and the Council mandate state, in Recital 47, that redress possibilities against measures taken against misuse should be available. Based on this wording, platforms appear to retain discretion over whether the due process options mentioned above are available against decisions taken under Article 20.
To conclude this section, the mandatory procedural mechanisms are important steps toward increasing user agency in content moderation and improving the accessibility of redress mechanisms. They can also help users document and substantiate their claims in individual cases. Crucially, compliance with these procedural obligations when enforcing terms and conditions, in combination with other due diligence obligations such as transparency reporting and risk mitigation, will be core to safeguarding freedom of expression online.
Administrative Complaints for the Infringement of the DSA
As mentioned above, the DSA creates a ‘right to lodge a complaint’ for individuals under Article 43. This allows individuals to submit complaints against intermediaries, alleging an infringement of the obligations laid down in the DSA, to the DSC of the Member State where the user resides or is established. According to Recital 81, complaints under Article 43 “should provide a faithful overview of concerns related to a particular intermediary service provider’s compliance and could also inform the Digital Services Coordinator of any more cross-cutting issues.”
The wording of the Commission’s Proposal on Article 43 refers only to “recipients of the service” as the persons entitled to submit complaints. An explicit reference to civil society organisations is found in the corresponding Recital 81 – “individuals or representative organisations should be able to lodge any complaint (…)”. The Council mandate on the DSA adds “representative organisations as referred to in Article 68” to the persons with a right to lodge a complaint under Article 43. These representative organisations under Article 68 are bodies, organisations or associations that (a) operate on a not-for-profit basis; (b) have been properly constituted in accordance with the law of a Member State; and (c) have a legitimate interest, included in their statutory objectives, in ensuring that the Regulation is complied with. Users can also have their rights under Articles 17 (internal complaint-handling mechanisms), 18 (out-of-court dispute settlement) and 19 (trusted flaggers) exercised by these organisations (EC Proposal). The EP mandate extends the scope of these rights to those found in Articles 8 (orders to act against illegal content), 12 (terms and conditions), 13 (transparency obligations for providers of intermediary services), 14 (notice and action), 15 (statement of reasons), 43 (right to lodge a complaint) and 43a (compensation – EP mandate).
In addition to creating a right to lodge a complaint, the EP mandate on the DSA introduces Article 43a, titled ‘Compensation’. This provision allows users to seek compensation from providers of intermediary services for any damage caused by the providers’ failure to comply with their obligations under the DSA. Bear in mind that individuals are already entitled to claim damages in tort under national laws.
What can we expect the application of these provisions to look like in practice? Article 43, considered in isolation, does not create a specific private right of action for individuals under the DSA to remedy infringements of individuals’ rights. At first glance, administrative complaints submitted under Article 43 will relate to the procedural rules mentioned above or to other due diligence obligations in the DSA. For instance, users can complain that they did not receive a clear and specific statement of reasons for a platform’s decision to remove or disable access to their content (Article 15). Or they can claim that the reporting mechanisms were not easy to access and user-friendly (e.g. Facebook was fined by Germany’s Federal Office of Justice because, among other things, the “NetzDG reporting form” was not made sufficiently transparent).
Nevertheless, claims about infringements of procedural rules (or other due diligence obligations) can be connected to claims about violations of individuals’ rights, for example where disputes relate to online speech. When complaining about a violation of a procedural rule such as a failure to provide a statement of reasons or to provide appeal mechanisms, individuals can also assert that such failure to comply constitutes an unjustified interference with their freedom of expression. In such cases, Article 43 may become a basis for a separate cause of action for infringements in individual cases.
The possibility to complain about individual infringements under the DSA could also arise when Article 43 is considered in combination with Article 12(2). Article 12(2) requires all platforms within the scope of the DSA to act “in a diligent, objective and proportionate manner”, with due regard to fundamental rights under the Charter of Fundamental Rights of the European Union, when enforcing and applying their terms and conditions. As mentioned earlier, the DSA does not explicitly create directly actionable rights for individuals. But given the lack of clarity as to how Article 12(2) is to be understood, as it currently stands it can be construed as a basis for complaints about infringements in individual cases. For example, where an individual’s content is removed or otherwise acted against on the basis of a platform’s terms and conditions, she may claim that the decision was a disproportionate restriction of her right to freedom of expression and that, in enforcing this restriction, the platform did not comply with Article 12(2); that is, the platform did not act “in a diligent, objective and proportionate manner” and with due regard to her fundamental rights. Along the same lines, she may claim that content subject to a dispute under the internal complaint-handling system was wrongfully found to be illegal or incompatible with the platform’s terms and conditions under Article 17(3), which can be considered a specification of the obligations in Article 12(2).
But it is hard to predict whether such claims about infringements of individuals’ rights under Article 43 would succeed in a meaningful way. Their success depends, among other things, on the scope of the DSCs’ powers and on whether a DSC can, and feels comfortable to, decide on freedom of expression or other fundamental rights-related complaints. Another factor that makes their success difficult to predict is the confusion as to how Article 12(2) is to be understood and how it relates to the procedural rules (Articles 14 – 20), the other due diligence obligations for online platforms and VLOPs, and the enforcement provisions of the DSA.
Apart from the prospects of success of these claims, it is worth flagging the risk that Article 43 will be weaponised in problematic ways, such as through coordinated complaints by trolls seeking to ‘game the system’. In view of the limited resources available to the DSCs and the uncertainty as to whether DSCs would have to hear all complaints submitted under Article 43, such coordinated use of the right to lodge a complaint could hinder the proper implementation of Article 43 and the protection of individuals’ rights online.
Concluding Remarks
In conclusion, the DSA takes significant steps forward in holding platforms accountable to their users. By requiring platforms to implement procedural standards in their content moderation processes, it limits platforms’ ability to make arbitrary decisions on third-party content and ensures that certain procedures exist for individuals to contest decisions made about their content and/or accounts. At the same time, individuals are given the possibility to oversee platforms’ compliance with the DSA through the right to lodge complaints under Article 43. At this moment, much is unclear as to how Article 43 will work out in practice, including the DSCs’ powers and activities under this provision. One possibility that arises upon a closer look at Article 43, especially in combination with Article 12(2), is that it can be construed as a separate private right of action for individuals to complain about infringements of their fundamental rights online. Yet it is hard to predict whether such complaints would be successful in the absence of more specific guidance on Article 43 and its interplay with other provisions of the DSA.
Joris van Hoboken is a Professor of Law at the Vrije Universiteit Brussel and an Associate Professor at the Institute for Information Law, University of Amsterdam. He works on the intersection of fundamental rights protection (privacy, freedom of expression, non-discrimination) and the regulation of platforms and internet services. At the VUB, he is appointed to the Chair ‘Fundamental Rights and Digital Transformation’, established at the Interdisciplinary Research Group on Law Science Technology & Society, with the support of Microsoft.
Ilaria Buri is a research fellow at the Institute for Information Law, University of Amsterdam, where her work focuses on the “DSA Observatory” project. She previously worked as a researcher at the KU Leuven Centre for IT and IP Law (CiTiP) on matters of data protection and cybersecurity. She is admitted to the Bar in Italy and, prior to joining academia, she gained extensive experience as a practitioner in law firms and worked at the European Commission (DG Climate Action).