Is the DSA moving in the right direction?

Dr. Joan Barata


The process of adopting the Digital Services Act (DSA) has just reached a very important and advanced stage. In the early hours of 23 April 2022, the so-called trilogue negotiations ended successfully with an agreement among the EU legislative bodies. The text proposed by the European Commission in mid-December 2020 was enriched by a series of amendments adopted by the European Parliament in January 2022. Everyone now awaits the final text to see exactly how the agreed compromises will play out.

Despite a flurry of social media activity from EU officials, the precise content of the agreements remains a mystery, and what little we know comes from indirect sources, social media rumors and academic speculation. It is understandable that high-level political negotiations need not always be disclosed, in order to preserve the flexibility and efficiency of certain phases of the very complex EU lawmaking process. From a basic human rights perspective, however, legislation that affects rights such as freedom of expression or privacy must be drafted in an open and transparent manner, guaranteeing that relevant stakeholders, including civil society, are able to engage with legislators and policymakers and warn them about the implications of their choices.

The DSA negotiations saw a rush of very sensitive, last-minute provisions with significant implications for online freedoms. These late additions have been viewed with particular concern by organizations such as EFF, CDT and Access Now, which have also warned against the haste of reaching a deal in record time.

Problematic amendments include a new commitment for providers to process a majority of valid notifications for the removal of illegal hate speech in less than 24 hours. The notice and action mechanisms under Article 14 already contained issues that needed to be addressed: safeguards were necessary to avoid forcing platforms, under the threat of possible liability, to make expeditious determinations regarding the legality of a piece of content. The very short timeframe now introduced may create unrealistic constraints that make platforms’ content decisions more cumbersome and lead to excessive removals.

The effective protection of the right to freedom of expression is also complicated by the incorporation of an “emergency mechanism” that would place in the hands of the European Commission even broader powers to determine how online platforms handle content in times of serious crisis. There is still not sufficient information regarding the specific wording of this new provision. According to civil society statements, the mechanism would broadly empower the European Commission to unilaterally declare an EU-wide state of emergency, enabling far-reaching restrictions of freedom of expression and of free access to and dissemination of information. According to the same accounts, the Commission may only implement this measure upon the recommendation of the Committee of National Coordinators of Digital Services. The Commission would also be obliged to inform the European Parliament and the Council about the measures taken, which could apply for a maximum of three months.

It is important to note, on this matter, that under international human rights law states may indeed take measures derogating from certain of their human rights obligations and commitments under international and regional instruments, to the extent strictly required by the exigencies of the situation. However, such derogations are subject to very strict requirements and constraints, particularly the need to carefully consider the justification for, and the reasons why, such a measure is necessary and legitimate in the circumstances. States also hold the legal obligation to narrow all derogations down to those strictly required by the exigencies of the situation, and no measure in this field may be inconsistent with the way fundamental rights are protected by international and regional legal instruments. It is also necessary to add that, in some cases, emergency situations may demand not restrictions but rather that freedom of expression and unfettered access to public information be treated as powerful instruments within the context of the crisis. The information made available so far does not suggest that the new provisions on the emergency mechanism contain any of these safeguards.

The ongoing crisis created by the illegal invasion of Ukraine by the Russian Federation may have inspired the adoption of these new rules. This approach is not completely new: Council Regulation (EU) 2022/350 of 1 March 2022, concerning “restrictive measures in view of Russia’s actions destabilizing the situation in Ukraine”, prohibits broadcasting, or facilitating the broadcasting of, any content by certain State-owned and controlled Russian media outlets, “including through transmission or distribution by any means such as cable, satellite, IP-TV, internet service providers, internet video-sharing platforms or applications”.

This was already a very problematic piece of ad hoc legislation, for a variety of reasons: it bypassed the competence of national independent audiovisual regulators in this field; it rested on a very broad and general assessment of the information provided by the mentioned outlets rather than on specific and properly analyzed pieces of content; and it was adopted in an emergency, without proper consultation and participation. It would not be far-fetched to conclude that, with this Regulation, European institutions have come closer to the type of decisions that authoritarian states such as Russia or Belarus have been adopting regarding the exercise of fundamental rights than to the practices of liberal democracies.

Will the new emergency mechanism generalize this kind of approach to platform regulation in the EU? If so, this would be very bad news.

It is important to underscore, in any case, that in the European model the establishment of restrictions on the right to freedom of expression by non-legislative bodies is tied to the presence of an independent body not subject to direct political scrutiny or guidance. The very prominent role that a non-independent body like the European Commission may play in the articulation and implementation of measures with a clear impact on speech contradicts this model.

Beyond these problematic additions, the negotiations have also led to what are perceived as missed opportunities in a few relevant areas.

The general right to use services anonymously has not been incorporated into the final text, and neither have the protection of end-to-end encrypted services or a prohibition on legally mandated upload filters. Although platforms would still have the possibility of voluntarily adopting such protections, the disclosure of this omission has created frustration among privacy advocates. Such disappointment has only been increased by the recent legislative proposal from the European Commission that would require chat apps like WhatsApp and Facebook Messenger to selectively scan users’ private messages for child sexual abuse material (CSAM) and “grooming” behavior. In practice, these rules introduce nothing short of mass surveillance, incompatible with basic European and universal rights and freedoms, no matter the declared intentions. Privacy experts have declared the proposal unworkable and invasive.

The agreement also missed the opportunity to clarify the scope and methodology for safe and legally certain compliance with the already existing provisions on systemic risk assessment and mitigation. Instead, a new category of broadly defined risks, identified as “gender-based cyber violence”, was the object of a last-minute agreement. The DSA thus still entrusts platforms with the responsibility to define and adopt, in the first instance, the measures to mitigate systemic risks, which need to be “reasonable, proportionate and effective”. In addition, platforms remain obliged to partner with the European Commission in drafting adapted content moderation practices to address these issues.

From a broader perspective, the final agreement also seems to have missed the opportunity to incorporate mandatory and extensive human rights checks and balances to be implemented by both relevant authorities and platforms as a fundamental component of any mandate or compliance mechanism.

On a positive note, it is also worth mentioning that the final agreement does not incorporate the proposed inclusion of online search engines within the scope of the DSA as a new category of “intermediary”, which would have created an obligation to delist search results, if not entire webpages, flagged as containing illegal content. Another wise choice was not embracing the proposal to exempt “traditional media” from content moderation.

These discussions have taken place at the same time as an important judicial decision regarding the application of the right to freedom of expression in the context of platforms’ content moderation practices. In its awaited decision in Poland v the Parliament and the Council (C-401/19), the Court of Justice of the European Union was expected to determine whether the obligation included in the so-called Copyright Directive, requiring online content-sharing service providers to review, prior to its dissemination to the public, the content that users wish to upload to their platforms, is compatible with the fundamental right to freedom of expression. The Court dismissed the action brought by Poland, concluding that such a limitation is proportionate insofar as it is accompanied by appropriate safeguards laid down by the EU legislature. For measures such as those under scrutiny to be acceptable under the freedom of expression clause enshrined in Article 11 of the Charter, proportionality must be respected and a fair balance struck between the rights and interests at stake.

This decision is important not only because it defines why mandated filters are legitimate in these cases, but also because it points to a general rule limiting the use of upload filters to situations where the risk of blocking legal content is minimal, as pointed out by Felix Reda in a recent analysis. The decision is also relevant in that forcing service providers to make their own assessments of the legality of a piece of content before it is uploaded would in practice impose a general monitoring obligation and, at the end of the day, represent an unnecessary and disproportionate restriction on the general right to freedom of expression.

The DSA is a remarkable and unique proposal in the field of online platform regulation. Proposals included in the Commission’s initial text and amendments introduced by the Parliament reinforce important and necessary duties of transparency, accountability, redress and, more broadly, protection of users’ rights. However, as has been shown, there are still some relevant areas that need attention, and other issues may emerge when the full text of the agreement is finally released. Final advocacy efforts and expert contributions will be fundamental to guaranteeing a balance between human rights, business model diversity, market efficiency and adequate regulation, particularly in our current turbulent times.


Joan Barata is a Fellow at the Cyber Policy Center of Stanford University. He works on freedom of expression, media regulation and intermediary liability issues. He teaches at various universities in different parts of the world and has published a large number of articles and books on these subjects, in both the academic and popular press. His work has taken him to most regions of the world, and he is regularly involved in projects with international organizations such as UNESCO, the Council of Europe, the Organization of American States and the Organization for Security and Cooperation in Europe, where he was the principal advisor to the Representative on Media Freedom. Joan Barata also has experience as a regulator: he held the position of Secretary General of the Audiovisual Council of Catalonia in Spain and was a member of the Permanent Secretariat of the Mediterranean Network of Regulatory Authorities. He has been a member of the Plataforma de Defensa de la Libertad de Expresión (PDLI) in Spain since 2017.