The DSA proposal and Germany

Naomi Appelman

(Institute for Information Law, IViR – University of Amsterdam)

The position of Germany on the DSA proposal

Recent reporting has shed light on the German position in the EU negotiations on the Digital Services Act (DSA) (see Netzpolitik, Tagesspiegel and Euractiv). Central to this position is the possible tension between Germany’s own relatively extensive platform regulation (see the following sections) and the EU-wide platform regulation the DSA aspires to create. According to the reporting, the fear is that the DSA will offer weaker protection than the German regulations and frustrate Germany’s own attempts to regulate online platforms. Accordingly, German criticism of the DSA has focussed, on the one hand, on ensuring enough space for national legislation and, on the other hand, on increasing the protection of citizens and the accountability of online platforms in the DSA.

For example, the reporting highlights how Germany lobbies for carve-outs “for deviating national regulations to promote cultural and linguistic diversity and to ensure pluralism” and “to combat hate speech, protect minors and public safety” (see the Tagesspiegel article). Similarly, Germany is critical of the proposed standardisation of removal orders and would rather see orders to act against illegal content left to national law. As to the elements it wants added to the DSA, Germany aims to include measures against “dark patterns”, which should be combatted by mandating fair and user-friendly digital services. Further, Germany would like to see messenger services included within the scope of the DSA, as these can function akin to social media platforms. Finally, Germany calls for stricter liability rules specifically for online marketplaces such as Amazon.

The broader legal, political and institutional context

Germany stands out at the EU level for its extensive and proactive online platform regulation. This brief overview introduces the most important aspects of the legal, political and institutional context of German platform regulation. It focusses solely on the federal level and does not go into the complexities of the ‘Länder’. This means the analysis is necessarily limited. For example, media policy is not a federal issue but is taken up at the regional level, with the specific legislation (the Medienstaatsvertrag) as well as the relevant supervisory authorities (the Landesmedienanstalten) all operating at the Länder level. Still, there are important players and developments at the federal level that will be discussed and that can give insight into the German context. The overview is divided into a description of the main actors, relevant legislation and current developments.

To start with the federal government, the most relevant ministries are the Ministry of Justice and Consumer Protection (Bundesministerium der Justiz und für Verbraucherschutz), the Ministry of Economic Affairs and Energy (Bundesministerium für Wirtschaft und Energie) and the Ministry of Transport and Digital Infrastructure (Bundesministerium für Verkehr und digitale Infrastruktur). Further, the Federal Office of Justice (Bundesamt für Justiz), which falls under the Ministry of Justice and Consumer Protection, is relevant as it is responsible for the monitoring and enforcement of the NetzDG (the 2017 German online speech law, further discussed below) and is one of the candidates discussed in the media for becoming the German ‘Digital Services Coordinator’ as described in the DSA.

Further, there are two types of supervisory authorities in Germany that are especially relevant for platform policy. Firstly, there are the 14 media supervisors (Landesmedienanstalten), which supervise regulation in the context of broadcasting, radio and media platforms. Even though these supervisors operate at the level of their respective Länder, cooperation mechanisms are in place to ensure the necessary unity in media policy and in international (mainly EU) cooperation. Importantly, the Landesmedienanstalten also reached such a shared position on the DSA. This position was overall positive, although it warns against the lack of specificity of several provisions and cautions that the proposed supervisory structure might undermine the existing EU-wide media supervision structures (e.g. ERGA). The other relevant supervisory authority is the Federal Network Agency for Electricity, Gas, Telecommunications, Post and Railway (the Bundesnetzagentur), as this agency is responsible for the physical telecommunications infrastructure in Germany. Both supervisory authorities are said to be possible candidates (together with the Federal Office of Justice) for becoming the DSA’s ‘Digital Services Coordinator’.

Besides government actors, Germany also has an active digital rights and related NGO landscape. Organisations that play a role in the German (and often also the European) public debate on platform regulation include AlgorithmWatch, Digitale Gesellschaft, Datenschutzverein (for other German EDRi members see here) and HateAid. Many of these organisations actively engage with German federal policies related to platform regulation, as well as with EU policy and legislative developments.

As to its actual platform legislation, Germany stands out for several reasons. To start, Germany has a unique and influential constitutional tradition. Of specific importance is the concept of ‘Drittwirkung’, which refers to the third-party effect or (indirect) horizontal application of fundamental rights. With regard to platform regulation, this ‘Drittwirkung’ is especially interesting in the application of the right to freedom of expression (Article 5 of the German constitution), as it can be used to oblige social media platforms to carry certain content.

Second, the most influential and high-profile piece of platform legislation in Germany is arguably the Network Enforcement Act (Netzwerkdurchsetzungsgesetz, or NetzDG). It was enacted in 2017 and is aimed at improving the enforcement of German speech laws online (for analysis see here and here). The NetzDG introduced extensive transparency obligations and measures meant to improve the accountability of large social media platforms and to combat online hate. However, critics decried the very tight deadlines the law mandates for the removal of unlawful content and feared a negative impact on freedom of expression online. The NetzDG additionally created the possibility of ‘regulated self-regulation’ as a further way to comply with speech norms and policies. The ‘Voluntary Self-Regulation for Multimedia Service Providers’ (Freiwillige Selbstkontrolle Multimedia-Dienstleister), active since 2003, is one of the first institutions to be formally recognised by the Federal Office of Justice (Bundesamt für Justiz): it met all the necessary oversight requirements and now qualifies as ‘regulated self-regulation’. Finally, in the last year two amendments to the NetzDG have gone into effect. These amendments have again been highly contested and clearly indicate that German federal platform legislation is still in active development. The first is the ‘Gesetzespaket gegen Rechtsextremismus und Hasskriminalität’ (legislative package against right-wing extremism and hate crime), which includes a reporting obligation for social media platforms (for context see here), and the second is the ‘Gesetz zur Änderung des Netzwerkdurchsetzungsgesetzes’ (act amending the Network Enforcement Act), aimed at strengthening user rights.

Other relevant legislation is the State Treaty on Media (the Medienstaatsvertrag), updated in 2018, which was amended to also address the selection and sorting mechanisms of search engines and social networks (for a discussion see here and here). Further, at the end of 2020 the federal government presented a proposed new law on cyber security and countering cyber crime, the ‘Zweites IT-Sicherheitsgesetz’ (IT-SiG 2.0). The proposal became law in April 2021. Finally, also of importance is the recent amendment to the competition law (Gesetz gegen Wettbewerbsbeschränkungen, GWB, amended in 2021) aimed at the digital economy. This legislation steers clear of most of the terrain covered by the Digital Markets Act, as it partially consists of the implementation of an EU directive and focusses on merger control in Germany.

Recent legal and political developments

The most important recent development has been the September 2021 federal elections. Crucially, improved digitisation was an important election topic, as the speed of internet connections and the large discrepancies in internet access remain notoriously problematic in Germany. All major parties included (a variation of) digital themes such as digital infrastructure, cybersecurity, digital literacy and the digitisation of government in their election programmes. The largest party, the Social Democratic Party (SPD), which is expected to play an important role in forming a coalition, focussed in its election programme on digitisation in the context of business, administration and education. Within these sectors, expanding and securing digital infrastructures, access to high-speed internet and improving digital literacy were important themes. As such, improving the digital infrastructure and internet access can be expected to be a government priority in the coming years.

Further, at the start of 2021 the federal government published a government data strategy. Its focus is on improving the digital infrastructure throughout the country, stimulating the public use of government data, and improving the government’s own use and publication of data in its services. Another interesting legal development is the implementation in German copyright law of the new EU copyright directive, the Digital Single Market Directive, specifically its Article 17. After extensive public debate, the directive was implemented over the summer.

Finally, a recent case of the German Federal Court of Justice (Bundesgerichtshof), the highest court in civil and criminal matters, illustrates the importance of the German constitutional tradition. In July 2021 the Court ruled, based on an indirect horizontal application of the constitutional right to freedom of expression (Art. 5 Grundgesetz), that social media platforms have procedural obligations when removing content based on their terms and conditions. Users must be informed about any removal of their content or any intended blocking of their account, and must be offered the opportunity to respond to such a notice. This ruling is clearly in line with the intended reforms of the DSA proposal, which, for example in Articles 15 and 17, plans to introduce similar measures.

As this overview shows, Germany has a unique and active political and regulatory debate on platform regulation. The German government has been, and continues to be, proactive in crafting specific platform regulation, and a broad and very active civil society keeps a close eye on these developments. As a consequence, Germany and its specific approach remain an important influence in the European debate on platform regulation.

 

Naomi Appelman is a PhD researcher at the Institute for Information Law (IViR) interested in online speech regulation and platform governance. Her interdisciplinary research combines information law, specifically online speech and platform regulation, with (agonistic) political philosophy. More concretely, her research asks how European law should facilitate the contestation of the content moderation systems governing online speech. The aim of facilitating this contestation is to minimise undue exclusion, often of already marginalised groups, from online spaces and to democratise power over how online speech is governed.

This contribution is part of an independent research project on mapping the position of, and situation in, the Member States with respect to the Digital Services Act (DSA), for which the DSA Observatory has received a (partial) financial contribution from our partner in this project, EDRi.