The Digital Services Act and Its Impact on the Right to Freedom of Expression: Special Focus on Risk Mitigation Obligations

Joan Barata

This blog post synthesises the main arguments and conclusions of a report commissioned from the author by the Plataforma en Defensa de la Libertad de Información (PDLI), available in full here.

 

Introduction

The UN Human Rights Council declared in its resolution 32/13 of 1 July 2016 that “(…) the same rights that people have offline must also be protected online, in particular freedom of expression, which is applicable regardless of frontiers and through any media of one’s choice, in accordance with articles 19 of the UDHR and ICCPR.” In doing so, it recalled its resolutions 20/8 of 5 July 2012 and 26/13 of 26 June 2014, on the subject of the promotion, protection and enjoyment of human rights on the Internet.

Human rights law has traditionally applied to the relations between individuals and States. The latter have the obligation not to impose unnecessary and disproportionate limits on freedom of expression, to ensure enabling environments for that right, and to protect its exercise. That said, new standards have been formulated aimed at extending some of these protections to relations between private parties and, in particular, between individuals and corporate businesses. The United Nations Guiding Principles on Business and Human Rights[1] constitute an important international document in this area, although they are neither binding nor even soft law. In his 2018 thematic report to the Human Rights Council[2], the United Nations Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, David Kaye, directly addressed platforms, urging them to recognize that “the authoritative global standard for ensuring freedom of expression on their platforms is human rights law, not the varying laws of States or their own private interests, and they should re-evaluate their content standards accordingly”.

Almost every State in the world has in place a set of national rules governing the dissemination of ideas, information, and opinions, online and offline.

Besides this, hosting providers generally moderate content according to their own private rules. Content moderation consists of a series of governance mechanisms that structure participation in a community to facilitate cooperation and prevent abuse. Platforms tend to promote the civility of debates and interactions to facilitate communication among users[3]. Platforms adopt these decisions on the basis of a series of internal principles and standards. The degree of discretion that companies with global reach and billions of users have in setting and interpreting private rules governing legal speech is very high. These rules have both a local and a global impact on the way facts, ideas and opinions on matters of public interest are disseminated.

In this context, EU Commissioners Margrethe Vestager and Thierry Breton presented two large legislative proposals in mid-December 2020: the Digital Services Act (DSA) and the Digital Markets Act (DMA)[4].

The DSA is intended to refit the 20-year-old E-Commerce Directive[5]. It does not repeal the basic provisions established in that Directive, in particular the principle of liability exemption for intermediaries. It also incorporates important new rights for users and obligations for service providers (particularly the so-called very large online platforms: VLOPs) in areas such as terms and conditions, transparency requirements, statements of reasons in cases of content removal, complaint-handling systems, and out-of-court dispute settlement, among others. Several provisions included in the DSA may have a relevant impact on the right to freedom of expression of users and third parties. This post focuses on three types of provisions: action orders from relevant authorities, notice and action mechanisms, and the assessment and mitigation of systemic risks.

 

Action orders, international standards and third countries’ legislation

Article 8 regulates orders to service providers from relevant national judicial and administrative authorities to act against a specific item of illegal content, on the basis of the applicable Union or national law, and in conformity with Union law. The territorial scope of these orders will be determined by the competent authority “on the basis of the applicable rules of Union and national law, including the Charter, and, where relevant, general principles of international law”. The only limitation comes from the requirement that orders must not “exceed what is strictly necessary to achieve (their) objective”. This article enables national authorities (including administrative bodies) to unilaterally impose a specific interpretation of international freedom of expression standards not only within the territory of the Union but also vis-à-vis third countries. In addition, the provision contains no specific safeguards ensuring consideration of the particular circumstances, impact and consequences that measures adopted against a certain piece of content may have within the context and under the legislation of a third country, particularly for the recipients of the information in question.

 

Notice and action mechanisms

Article 14 of the proposal regulates notice and action mechanisms. Although the basis of the notice and action mechanism is the existence of a specific item of illegal content, the DSA deliberately refrains from defining what would be considered “illegal” in this context and, in general, in the context of the Regulation as a whole. This vagueness and breadth may trigger over-removal of content and affect users’ right to freedom of expression, since illegal content is a broad category covering very diverse types of material. Therefore, without necessarily assuming the task of defining what content is illegal, the DSA would need to establish an obligation for notifiers not only to explain why a certain piece of content is considered illegal, but also to properly substantiate the circumstances, context and nature of the alleged violation of the law. It would also be important to introduce an additional provision establishing that, when considering notices and before adopting any decision on disabling access or removal, hosting providers are entitled and required to make their own good-faith assessment on the basis of the principles of legality, necessity and proportionality.

 

Mitigation and assessment of systemic risks

Very large online platforms (VLOPs), as defined by article 25 of the proposal, i.e. those providing their services to an average of 45 million or more monthly active recipients in the Union, will assume new duties under the DSA to assess and mitigate “systemic risks”. The existence and nature of such risks are not clearly described or substantiated by the legislator. Recital 56 proclaims that platforms “can set the rules of the game, without effectively identifying and mitigating the risks and the societal and economic harm they can cause”.

The risks mentioned in article 26 are closely connected to general societal risks, which would exist with or without the intermediation of online platforms. The extent to which, and the manner in which, these risks may be aggravated, and how such aggravation can realistically and properly be assessed by platforms, is only vaguely presented in the proposal, creating a high degree of uncertainty for platforms and granting excessive discretion to the relevant authorities.

Article 27 regulates the different ways in which such risks are to be mitigated by platforms, and requires that the measures adopted to mitigate systemic risks be “reasonable, proportionate and effective”. However, considering the complexity of the tasks assigned to online platforms in the first instance, these general principles provide little clarity or foreseeability regarding the measures and practices to be implemented.

Recital 68 establishes that “risk mitigation measures (…) should be explored via self- and co-regulatory agreements” (contemplated in article 35) and, in particular, that “the refusal without proper explanations by an online platform of the Commission’s invitation to participate in the drawing up and application of such a code of conduct could be taken into account, where relevant, when determining whether the online platform has infringed the obligations laid down by this Regulation”. Such a determination is implemented in particular through the enhanced supervision mechanisms set out in article 50.

It needs to be noted that in many cases the only feasible way to deal with systemic risks and/or to comply with the rules established via the mentioned codes may be the use of automated filtering mechanisms. Without prejudice to the transparency obligations included in the DSA regarding the use of such mechanisms, it is important to note that errors by automated monitoring tools can seriously and irreversibly harm users’ fundamental rights to privacy, free expression and information, freedom from discrimination, and fair process. However, the DSA does not contain any clear and binding guidance on the design and implementation of this type of measure, particularly when it comes to human rights implications.

It is also necessary to point to the absence of any provision requiring that platforms, co-regulatory mechanisms and oversight bodies properly consider the impact that the implementation of the mentioned mitigation measures may have on human rights, and particularly on freedom of expression.

In addition, in the European model the establishment of restrictions on the right to freedom of expression by non-legislative bodies is tied to the presence of an independent body not subject to direct political scrutiny or guidance. The very significant role that a non-independent body such as the European Commission may play in the articulation and implementation of measures with a clear impact on speech contradicts this model.

Last but not least, and from a more general point of view, there are no specific provisions requiring that platforms’ internal and independent processes and audits incorporate a clear and thorough human rights impact perspective grounded in international law, particularly in the area now under consideration.

 

Conclusion

The DSA constitutes a very relevant and comprehensive proposal. It establishes a series of fundamental rules and principles regarding, essentially, the way intermediaries participate in the distribution of online content. It also incorporates important new rights for users and obligations for service providers (particularly VLOPs).

However, some provisions, and particularly those related to duties and responsibilities regarding the assessment and mitigation of systemic risks, may have an unnecessary and disproportionate impact on the right to freedom of expression of users.

Therefore, possible amendments to articles 26 and 27 need to promote a focus on clearly defined illegal content, rather than on broader categories of so-called harmful content. Moreover, any due diligence obligation aimed at preventing the dissemination of illegal content needs to be commercially reasonable, transparent, proportionate, more principled than prescriptive, and flexible. Such obligations should not focus on the outcomes of content moderation processes, i.e. intermediaries should not be evaluated on whether they have removed “enough” illegal content, as this creates a strong incentive towards over-removal of lawful speech. In order to facilitate the effectiveness of such measures, intermediaries might be subject to ex ante regulatory oversight, receive support and assistance from regulators, civil society and other major stakeholders, and engage in the adoption of codes of conduct.

Provisions related to the matters mentioned in the previous paragraph would need to establish an obligation to undertake solid and comprehensive human rights impact assessments. As a matter of principle, any private or regulatory decision adopted in this field would need to guarantee that restrictions on or conditions affecting human rights have been properly considered on the basis of the principles of necessity and proportionality. In addition, the DSA would also need to establish proper mechanisms for the ex post assessment of the implementation of due diligence measures, as well as proper appeal mechanisms for all interested parties regarding all relevant decisions adopted in this field by the competent bodies.

 

Joan Barata is a Fellow at the Cyber Policy Center of Stanford University. He works on freedom of expression, media regulation and intermediary liability issues. He teaches at various universities in different parts of the world and has published a large number of articles and books on these subjects, both in the academic and the popular press. His work has taken him to most regions of the world, and he is regularly involved in projects with international organizations such as UNESCO, the Council of Europe, the Organization of American States and the Organization for Security and Co-operation in Europe, where he was principal advisor to the Representative on Freedom of the Media. Joan Barata also has experience as a regulator, having held the position of Secretary General of the Audiovisual Council of Catalonia in Spain and having been a member of the Permanent Secretariat of the Mediterranean Network of Regulatory Authorities. He has been a member of the Plataforma en Defensa de la Libertad de Información (PDLI) in Spain since 2017.

 

[1] These principles were developed by the Special Representative of the Secretary-General on the issue of human rights and transnational corporations and other business enterprises. The Human Rights Council endorsed the Guiding Principles in its resolution 17/4 of 16 June 2011. Available at: https://www.ohchr.org/documents/publications/guidingprinciplesbusinesshr_en.pdf

[2] Available at: https://www.ohchr.org/EN/Issues/FreedomOpinion/Pages/ContentRegulation.aspx

[3] James Grimmelmann, “The Virtues of Moderation”, 17 Yale J.L. & Tech (2015). Available online at: https://digitalcommons.law.yale.edu/yjolt/vol17/iss1/2

[4] https://ec.europa.eu/digital-single-market/en/digital-services-act-package

[5] Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market.