The DSA proposal and France

Ilaria Buri

(Institute for Information Law, University of Amsterdam)

The French position on the DSA proposal

It should come as no surprise that France is playing a very active and central role in the current discussions on the DSA package. Not only is the DSA one of the defining proposals of this EU legislature, but the matters covered by the DSA proposal, and the very idea of reining in the power of Big Tech, have long been a high priority for Paris. In May 2020, just a few months before the European Commission unveiled its DSA proposal, France passed a heavily criticized online content moderation bill, the “Avia law” (later struck down by the French Constitutional Council, as discussed further below), in a legislative move that created palpable tensions at the EU level.

Monitoring the French position on the DSA proposal is essential to understanding how the DSA process is likely to evolve. France is one of the largest EU Member States and, as such, holds very strong voting power in the Council. Most importantly, following the Portuguese and Slovenian presidencies in 2021, France will take over the rotating presidency of the Council of the EU in the first semester of 2022. In addition, the period of the French Council presidency will coincide with a crucial institutional event at the domestic level: the election, in May 2022, of France’s new president (preceded by a heated electoral campaign, in which the questions of online harms and Big Tech’s power rank high on the candidates’ agendas).

It is no secret that France is keen on reaching an agreement on the DSA package under its Council presidency. Given the presidential elections scheduled for May 2022, the finalization of this landmark piece of legislation would, for the French government, ideally materialize in early spring 2022, during the last weeks of the electoral campaign.

One of the most heated issues in the current DSA negotiations, and one in which France has been very active, concerns the country-of-origin principle, a cornerstone of the DSA proposal and of its predecessor, the e-Commerce Directive. France has advocated for a revision of this principle, under which the national authorities of the country of establishment are exclusively competent for supervision and enforcement vis-à-vis service providers (including VLOPs which offer their services in every Member State). Pointing to the failures that have occurred in the enforcement of the GDPR, and fearing that a similar stalemate would likely continue under the DSA, France advocated granting the authorities of the country of destination specific powers of intervention and enforcement (particularly when it comes to online marketplaces). While the French initiative appears to be backed by other large Member States – such as Germany, Italy and Spain – several other Member States hold very different views on how the country-of-origin principle should be articulated in the DSA Regulation. In reaction to the French stance, Ireland – together with ten other countries – signed a non-paper on “effective supervision in the DSA” in defense of the current country-of-origin regulatory architecture. With regard to the supervision of automated content moderation tools, France supports the IMCO draft report, which grants the Commission new powers to check the algorithms governing the moderation of online content.

In October 2021, the Slovenian presidency presented a compromise text on these questions, which entrusts the Commission with the power to intervene against VLOPs, in specific cases, when requested by national authorities. A first draft of the conclusions of the European Council meeting of October 2021 included a reference to spring 2022, during the French Council presidency, as the target for the final approval of the DSA. However, the second draft and the final version of the conclusions of the European Council of 21-22 October 2021 do not commit to any specific deadline. Somewhat less ambitiously, the European Council calls on the co-legislators to continue work on a series of key legislative proposals – including the DSA and the DMA – in order to reach an agreement “as soon as possible”.

It has been suggested that tensions around the country-of-origin principle are a determining factor in the apparent slowdown of the DSA discussions. Besides creating frictions in particular with Ireland and Luxembourg, where most VLOPs are established, France’s proposal to enhance the powers of the authorities of the country of destination raises questions about the risk that countries struggling with rule of law issues could leverage such increased powers to restrict freedom of speech and other fundamental rights. Furthermore, even in a DSA scenario with strengthened national authorities, effective enforcement against VLOPs would not be guaranteed.

Post-European Council media coverage reports that the French government is reshaping its strategy around enforcement and the country-of-origin principle. Supported by countries such as Germany, the Netherlands and Luxembourg, France has advanced a new proposal under which more centralized enforcement powers, exclusively over VLOPs, would be attributed to the European Commission.

In any case, the forthcoming final IMCO report on the DSA – originally scheduled for adoption on 8 November, but postponed because the Facebook whistleblower is addressing the European Parliament on the same day – and the next Council compromise proposal will soon reveal whether the French amendments have gained enough momentum and support to make their way into the final texts and the trilogue negotiations.

Recent developments on platform regulation in France

Like Germany (with the NetzDG), France had already legislated in the area of illegal content in the run-up to the DSA proposal. In May 2020, France passed a bill to tackle online hate speech, the so-called Avia law. Named after its rapporteur Laetitia Avia (“La République en Marche”), the Avia law introduced significant amendments to law n. 2004-575, which transposes the e-Commerce Directive into French law. The law imposed two new obligations on providers of online communication services: (i) the obligation to remove, within one hour, terrorist or child pornography content notified by the administrative authority, with any failure to remove such content punishable by a fine of €250,000 or one year of imprisonment; (ii) the obligation to remove, within 24 hours, hateful content (material containing incitement to hatred, violence or offenses on grounds of race or religion) or sexual content flagged by anyone, with the failure to remove or make such content inaccessible subject to a fine of €250,000.

The Avia law resembled the DSA proposal, particularly in its most recent developments, in that it required platforms to adopt the organizational and technological measures necessary to ensure the timely assessment and removal of flagged content. The Conseil Supérieur de l’Audiovisuel (the French media regulator) was granted extensive powers to supervise the implementation of the law and to sanction non-compliance (with fines of up to €20 million or 4% of annual turnover, whichever is higher). Given the role attributed to it under the Avia law, it is likely that the Conseil Supérieur de l’Audiovisuel will be the authority designated by the French government to take on the role of Digital Services Coordinator under the DSA.

The Avia proposal was heavily criticized by digital rights activists for the significant censorship risk it would bring about. In particular, La Quadrature du Net and 75 other signatories argued in an open letter that measures tackling the ads-based business model of the big platforms – such as mandatory interoperability – could be more effective than the Avia law in combating the spread of hateful and dangerous content.

Just one month after its enactment, in June 2020, the French Conseil Constitutionnel struck down the Avia law, holding that all of its main provisions (and particularly the removal deadlines) were in conflict with the French Constitution. The Constitutional Council argued that the system introduced by the French legislator – characterized by the combination of extremely short removal timeframes and the threat of high fines – would inevitably result in the automatic removal of all flagged content. Consequently, it concluded that the content moderation obligations introduced by the Avia law were disproportionate, unnecessary and inadequate – and therefore unconstitutional – given the interference they entailed with freedom of expression.

France and digital policies: broader institutional and societal context

A number of French public bodies are responsible for, and contribute to, the definition of different aspects of digital policy. The Secretary of State for Digital Transition and Electronic Communications (“Secrétariat d’État chargé du Numérique”), by delegation of the Prime Minister, coordinates a variety of policies concerning the digital transformation, the development of the digital economy and the improvement of digital skills. In collaboration with the other relevant ministries, it deals with questions of internet governance, digital infrastructures and the security of network and information systems. Notably, among other competences, the Secretary of State for Digital Transition is in charge of developing the legal framework for digital technologies and platform regulation at the national, European and international levels, and of ensuring fundamental rights and ethics in the digital environment.

The French National Digital Council (“Conseil national du numérique”) was established in 2011 as an independent consultative commission in charge of promoting an open debate on the many dimensions of the interaction between digital technologies and society. Active at the national, European and territorial levels, the National Digital Council is placed under the Secretary of State for Digital Transition and Electronic Communications. Its interdisciplinary college is composed of 17 members appointed by the Prime Minister (researchers, journalists, lawyers, entrepreneurs and policy-makers) and four MPs appointed by the presidents of the National Assembly and the Senate.

Furthermore, the Interministerial Digital Directorate (DINUM) is responsible for all aspects of the digital transformation of the State for the benefit of citizens and civil servants (modernisation of the State’s information system, creation and quality of digital public services, etc.).

A number of French civil society organizations have been active in the field of digital rights. La Quadrature du Net, founded in 2008, is committed to the defense of fundamental rights in the digital society, with a special focus on censorship and surveillance. Another important association is Creis-Terminal (created in 2010 from the merger of two organizations founded in 1984 and 1979 respectively), which carries out studies and promotes debate on the societal impact of digital technologies. CECIL (Centre d’Études sur la Citoyenneté, l’Informatisation et les Libertés) is active in the critical study of the information society and in promoting public debate and citizen interventions in this field. These organizations take part in the Observatoire des Libertés et du Numérique (Freedoms and Digital Observatory), founded in 2014; the other partners in the Observatory are French human rights organizations such as the Ligue des Droits de l’Homme (Human Rights League, founded in 1898), the Syndicat de la magistrature, the Syndicat des Avocats de France, and Amnesty International France.

 

Ilaria Buri is a research fellow at the Institute for Information Law, University of Amsterdam, where her work focuses on the “DSA Observatory” project. She previously worked as a researcher at the KU Leuven Centre for IT and IP Law (CiTiP) on matters of data protection and cybersecurity. She is admitted to the Bar in Italy and, prior to joining academia, she gained extensive experience as a practitioner in law firms and worked at the European Commission (DG Climate Action).

 

This contribution is part of an independent research project on mapping the position of and situation in the Member States with respect to the Digital Services Act (DSA), for which the DSA Observatory has received a (partial) financial contribution from our partner in this project, EDRi.

Photo by Julien Doclot on Unsplash