The DSA proposal and the Netherlands

Joris van Hoboken

(Institute for Information Law, IViR – University of Amsterdam)

The position of the Netherlands on the DSA proposal

The Netherlands supports the revision of the e-Commerce Directive (ECD) framework and the main policy objectives and overall choices made by the European Commission in its proposal for the DSA. It attaches weight to the economic implications, the improvement of frameworks for the identification and removal of illegal content, and the protection of consumer safety online and respect for fundamental rights, in line with the positions taken in the D9+ non-paper from 2020.

The Dutch government’s proposed amendments to the DSA have been relatively minor so far. Its most significant position, expressed relatively recently toward the Council, is support for specific monitoring obligations (these would amount to notice-and-stay-down obligations that remain in line with the case law and developments at the CJEU, the Glawischnig-Piesczek judgment in particular). In proposing this, the Dutch government continues to stress its support for the ban on general monitoring and is apparently looking for clarifications in the proposal on the difference between general and specific monitoring obligations. An amendment in line with this position was recently added to the current Council negotiation text in recital 28:

Providers of intermediary services should not be subject to a monitoring obligation with respect to obligations of a general nature. This does not concern monitoring obligations in a specific case and, in particular, does not affect orders by national authorities in accordance with national legislation, in accordance with the conditions established in this Regulation. Such orders should not consist in requiring a service provider to introduce, exclusively at its own expense, a screening system which entails general and permanent monitoring in order to prevent any future infringement. However, such orders may require a provider of hosting services to remove information which it stores, the content of which is identical or equivalent to the content of information which was previously declared to be unlawful, or to block access to that information, irrespective of who requested the storage of that information, provided that the monitoring of and search for the information concerned is limited to information properly identified in the injunction, such as the name of the person concerned by the infringement determined previously, the circumstances in which that infringement was determined and equivalent content to that which was declared to be illegal, and does not require the provider of hosting services to carry out an independent assessment of that content. Nothing in this Regulation should be construed as an imposition of a general monitoring obligation or a general active fact-finding obligation, or as a general obligation for providers to take proactive measures in relation to illegal content. (Council text 16 September 2021)

The broader legal, political and institutional context

The Netherlands’ position on the DSA is shaped by a variety of legal, political and institutional factors. Legally speaking, the issues in the Netherlands with respect to the ECD and its implementation are relatively unremarkable compared to some other member states. The implementation of the ECD stayed very close to the text of articles 12-15 ECD, and Dutch case law on intermediary liability raises few significant issues in relation to the ECD framework and the case law of the European Court of Justice. The Netherlands has one law anticipating the DSA proposal’s shift to due diligence and regulatory oversight over structural obligations, discussed below.

One clear influence on the Dutch position (and on important Dutch stakeholders) is the self- and co-regulatory system built up in the Netherlands over the last decade. In its communications on the DSA, for instance in its non-paper, the government sets out the Dutch institutional setting and the role played by various actors in addressing illegal content online. These include several regulatory authorities, but more significantly a number of government-supported and funded organizations in specific issue areas, such as the expertise centre for CSAM (EOKM) and the hotline for illegal discriminatory internet content (MiND).

A central element in the Dutch policy context for the ECD and illegal content online has been the Dutch Notice and Takedown Code, initially drawn up in 2008 and promoted at the EU level over the past decade. The Code was drawn up with the participation of the internet industry, internet access providers (the telecoms sector), rights holders and the government. Among its notable elements is the emphasis on the subsidiarity principle, calling for notice and takedown to take place as close to the illegal content as possible. An addendum on combating CSAM was adopted in 2018, in response to calls by the Dutch government.

The legal and policy framework for child sexual abuse material (CSAM) plays an important role in the Dutch discussions around illegal content online. The Netherlands has consistently featured at the top of the list of countries in which CSAM is hosted and distributed. In policy circles, this state of affairs is typically connected to the leading position of the Netherlands in European internet infrastructure, which is considered an important economic asset. Due to a combination of factors, including a generally favorable business climate, direct connections to intercontinental submarine cables, the world-leading Amsterdam Internet Exchange (also considered the country’s third mainport) and good digital infrastructure more generally, the Netherlands is an attractive country for data centers and hosting providers. In recent years, new instruments have been developed by the sector, including better monitoring instruments for CSAM, the addendum to the Notice and Takedown Code, and hash sharing. Critics have pointed out that law enforcement may not have placed enough priority on actual investigations and enforcement in this area.

Recent legal and political developments

The current Dutch government (new coalition discussions have been under way since March 2021 and the prospects for a majority coalition remain precarious) has taken initiatives with respect to CSAM that align with the new due diligence obligations in TERREG and the DSA. In particular, a draft law was put out for consultation in early 2021 that would create a new independent administrative authority with the power to order takedowns of CSAM, oversee a more general obligation for hosting providers to “take suitable and proportionate measures to limit the storage and distribution of CSAM through their services”, and receive information on the measures taken. Infringements of these obligations could trigger administrative procedures, including fines. The authority is also to play a role in data sharing and the management of hash-sharing collaborations. The draft law is of particular relevance to the DSA proposal since, if established, the authority would be a potential candidate for becoming the Digital Services Coordinator proposed in the DSA (and for taking up the new regulatory oversight obligations under the terrorist content regulation). The draft law received pushback from several sides, including the judiciary, which stressed the lack of judicial review of removal orders and the questions this raises for the protection of the right to freedom of expression. The law has not yet been introduced in parliament.

Another area of legal and political development in the Netherlands concerns the difficulties people face in actually getting unlawful content removed, and the campaign to improve the legal framework in this regard. A study (Dutch, English summary) analyzed the current state of affairs and considered various potential improvements. The study concludes that the DSA is an important venue for improving the procedures followed by service providers, given its codification of notice and takedown standards. It also stresses the heterogeneity of the problem of illegal content online and the limitations of several legal routes for removal from an access-to-justice perspective. The lack of good information and legal support for citizens stands out as an important area for improvement.

In the last year, there have been several notable lower court judgments on terms-of-service-based removals by dominant service providers (in particular YouTube) in the area of disinformation. In the Dutch disinformation policy discussions more generally, much emphasis has been placed on the right to freedom of expression, and the Netherlands has advocated a relatively light approach (transparency), with no major new legislation tabled in recent years. In court, Google has defended several high-profile removals against claims that they were unlawful and interfered with the right to freedom of expression. While the courts have not dismissed such claims altogether, they have not developed particularly strong hooks for similar claims to succeed in the Netherlands. They point to the fact that dominant service providers like Google implement disinformation policies partly in response to demands placed on them by policy makers (e.g. the EU Code of Conduct) and in line with guidance from international organizations, and they rely on the service providers’ right to conduct a business and right to property as a basis for discretion over content moderation on their services.

As mentioned above, the Dutch internet sector is well developed and consists of a combination of smaller local and bigger national players, as well as international actors, with some notable platform services, such as Uber, having their main European establishment in the Netherlands. For the digital sector, the platform organizations ECP (Platform for the Information Society) and the more recently established DINL (Digital Infrastructure Netherlands) play an important role in policy discussions related to internet regulation, and the internationally dominant platforms tend to be well connected to these discussions as well.

As regards civil society organizations active in these areas, Bits of Freedom has played an important role in intermediary liability discussions in the Netherlands over the years, with an emphasis on due process and the protection of fundamental rights in notice and takedown processes, and on the fight against legal filtering mandates on access providers and in the new copyright directive. In the context of IPR enforcement, Stichting Brein is a dominant actor in intermediary liability debates and developments.

As with other countries, the position of the Netherlands can be expected to develop further in response to EU-level dynamics. The lead on the DSA negotiations for the Netherlands is held by the Ministry of Economic Affairs, but many other departments have some relation to the file, in particular the Ministry of Justice and Security. In the Dutch Parliament, there is some push for the government to also take up issues around online tracking (with new restrictions or a potential ban, with a leading role for some Dutch MEPs in the campaign for tracking-free ads), but the government has shown no appetite for taking up these issues in its position in the Council, referring to a lack of evidence on this policy problem and on the impact of such measures. The Dutch Green Party has recently asked a series of questions about the proposals in the Dutch Senate.


Joris van Hoboken is a Professor of Law at the Vrije Universiteit Brussel (VUB) and an Associate Professor at the Institute for Information Law, University of Amsterdam. He works at the intersection of fundamental rights protection (privacy, freedom of expression, non-discrimination) and the regulation of platforms and internet services. At the VUB, he holds the Chair ‘Fundamental Rights and Digital Transformation’, established at the Interdisciplinary Research Group on Law Science Technology & Society with the support of Microsoft.


This contribution is part of an independent research project mapping the positions of, and the situation in, the Member States with respect to the Digital Services Act (DSA), for which the DSA Observatory has received a (partial) financial contribution from our partner in this project, EDRi.