The DSA proposal and Denmark
Jesper Lund
(IT-Pol Denmark)
Disclaimer: Dear reader, please note that this commentary was published before the DSA was finalised and is therefore based on an outdated version of the DSA draft proposal. The DSA’s final text, which can be found here, differs in numerous ways, including a revised numbering for many of its articles.
The position of Denmark on the DSA proposal
The Danish government must seek a mandate from the European Affairs Committee of the Danish Parliament on all EU legislative dossiers. In the Danish government, the DSA proposal is primarily handled by the Ministry of Industry, Business and Financial Affairs.
Denmark supports the European Commission’s intention to modernise the legislation for intermediary services. The Danish government emphasises that the proposal will secure a safer and more responsible platform economy through common rules for the removal of illegal content and redress mechanisms for users whose content has been removed.
Striking the right balance between removal of illegal content and freedom of expression is important for Denmark. The DSA should only provide rules for harmful content in special cases, e.g. transparency requirements for disinformation in connection with the European Democracy Action Plan (EDAP). Platforms should not be required, under the DSA, to remove content from news media organisations and public service providers with an editorial obligation. The Danish position does not seem to include a “must carry” obligation for news media content, so voluntary content moderation would still be possible.
According to the Danish government, the notice and action system in Article 14 should have clearly defined timelines for platforms to act on illegal content, and these timelines should be differentiated according to the nature of the illegal content, so that content with substantial harmful effect (e.g. terrorist content and CSAM) is removed as quickly as possible. Very large online platforms (VLOPs) should play a more proactive role in removing illegal content, including a stay-down obligation for content that has been identified as illegal, although this should not create a general monitoring obligation. At the same time, it is important that the DSA does not create an incentive for platforms to remove legal content. In this regard, the Danish government does not believe that short timelines will create a risk that legal content is removed.
The position is different for orders issued by public authorities in other Member States to remove content under Article 8 of the DSA proposal. Here, Denmark is concerned that this provision could lead to over-removal of content due to cultural and legal differences in what is considered illegal in different national legislations. Similar to the Terrorist Content Online (TCO) Regulation, cross-border removal orders in the DSA will likely give rise to constitutional concerns for Denmark, since the Danish constitution prohibits other States from directly exercising authority on its territory, e.g. by issuing orders that are legally binding on Danish companies. The Danish constitution only allows such powers to be delegated to international authorities such as the European Union, but not to other States. In order to safeguard the functioning of the internal market, Denmark finds that the exception to the country-of-origin principle (cross-border removal orders in Article 8) should only apply to harmonised EU rules.
The current Danish position on the DSA does not seem to include amendments to the liability exemptions for intermediary services. However, in its response to the Commission’s DSA consultation (September 2020), the Danish government advocated for a duty of care obligation, where online platforms would only be covered by the liability exemption if they are taking measures that could reasonably be expected to proactively detect and remove illegal content on their services and collaborate with governments in a transparent manner.
The Danish government is worried that illegal content could migrate to smaller platforms if enforcement efforts are stepped up on larger platforms. Therefore, small and (possibly) micro enterprises should not be exempted from the requirement, set out in Article 21 DSA proposal, to notify law enforcement authorities of serious criminal offences. According to the same logic, small and micro enterprises should also be subject to the requirement in Article 19 DSA proposal to process notices from trusted flaggers with priority and without delay. Regarding exclusions from certain DSA provisions, Denmark suggests that they should be based on the number of users of the intermediary service, similar to Article 17(6) of the Copyright Directive (EU) 2019/790, instead of the criteria that define micro and small enterprises in Recommendation 2003/361/EC (turnover, size of balance sheet and number of employees).
Denmark supports a dedicated chapter in the DSA for online marketplaces with stricter liability rules. The proactive obligations for VLOPs to prevent illegal content should include, besides the stay-down obligation mentioned above, screening the products offered on the platform against the RAPEX database (EU alert system for dangerous products). In terms of intermediary liability, the online marketplace (irrespective of size) should be responsible for compliance with consumer protection, marketing and product liability rules if it is not completely clear to the average consumer that he/she is not buying from the online marketplace (platform). After considerable pressure from consumer protection and trade organisations, a majority in the Danish Parliament adopted a resolution that online marketplaces should assume the legal role of EU importer for the product if the seller is based in a third country and the buyer is a consumer. The Danish government has recently updated its DSA position to reflect that.
For aspects of the DSA other than platforms’ obligations for handling illegal content, the Danish position is much less detailed. Denmark emphasises the importance of greater transparency for online advertising and stricter rules for advertising targeted towards children. On algorithmic transparency, competent authorities should have access to algorithms for VLOPs. Denmark supports extending access to platforms’ data beyond vetted researchers (as in the DSA proposal), to also include Member States’ authorities. For VLOPs, the Danish government argues that the Commission should have an “active role” (not further specified) in the enforcement of the DSA.
Finally, Denmark wants a clarification in the DSA that Member States can maintain national rules, e.g. obligations for platforms to remove illegal content within specific timelines. Denmark plans to introduce such rules at the national level (similar to the German NetzDG) before the DSA is adopted, as discussed below.
The Danish government’s position does not contain any details about which authority will be selected for the critical role of Digital Services Coordinator. There is a parliamentary question about this matter, which has not yet been answered by the Minister for Industry, Business and Financial Affairs.
The broader legal, political and institutional context
Until a couple of years ago, the Danish political landscape was generally characterised by a laissez-faire attitude towards the digital economy and Big Tech, highlighting the advantages of online services and downplaying their risks, e.g. the vast collection of personal data and the profiling of individuals. A 2013 report from the Danish Business Authority about big data as a growth factor painted an optimistic view of the data generated by smartphones and social media, in which regulation was identified as a barrier to the development of the data-driven economy. In 2017, the Ministry of Foreign Affairs appointed a tech ambassador in Silicon Valley, and the newly appointed ambassador called social media a gift to countries of Denmark’s size.
The public sector has embraced cloud computing services offered by Big Tech, largely ignoring the data protection challenges raised by the Schrems rulings of the European Court of Justice. Many public schools use G Suite for Education as their digital learning platform. Since the early years of Facebook, the Danish public broadcaster (DR) has used and promoted the social media platform very actively as a way for viewers to engage with content from DR, a move which has arguably contributed to the widespread use of Facebook in Denmark. Facebook has become the de facto platform for political discussions. Politicians at the national and local level use Facebook almost exclusively for addressing the public online and running political campaigns, often with paid advertising on Facebook. The Prime Minister has recently been criticised by journalists for preferring to address the public directly on Facebook instead of giving interviews to news media.
However, the political discourse has changed significantly over the last couple of years, especially with respect to Big Tech and dominant social media platforms. Several factors have contributed to this shift in the public conversation on platform regulation. News media reporting has portrayed social media as a place where illegal content, e.g. image-based sexual abuse of women, misinformation, harmful content such as self-injury and suicide postings, hate speech and harassment are spreading rapidly. Members of the Danish Parliament have called for stronger enforcement against illegal content on social media platforms inspired by the German NetzDG law.
Large social media platforms, and in particular Facebook, are criticised for not removing illegal content sufficiently fast and, at the same time, for removing legal content based on seemingly arbitrary interpretations of the platforms’ own community standards. The liability exemption for hosting services in the e-Commerce Directive is often portrayed as an excuse for social media platforms not to take greater responsibility for the illegal content shared by their users.
There are also growing political concerns over the power of Big Tech in Danish society, in particular the gatekeeping role exercised by big platforms, which results in almost every type of content – from news media articles to children’s content and culture – being accessed through platforms like Facebook. In August 2020, YouTube was heavily criticised for temporarily removing Danish music when the licensing agreement with rightsholders lapsed and a new agreement could not be reached (even though this is a requirement under Article 17 of the Copyright Directive). There is broad political support for more democratic control over Big Tech, whose platforms are regarded as if they were Danish companies to be regulated through Danish law, even though the companies are established in other Member States. From this perspective, the perceived problem is not so much the huge centralised power of dominant platforms over the lives of individuals, but rather that this power is not under democratic control (meaning, in practice, government control).
The current Danish government is formed by the Social Democrats, which is also the party of Christel Schaldemose, the rapporteur for the DSA in the European Parliament. In the annex of her draft DSA report, Schaldemose lists the Danish Ministry of Business Affairs among the entities from which she has received input. The positions of Schaldemose and the Danish government on fixed timelines for removal of illegal content – which are not included in the final IMCO compromise text voted on 14 December 2021 – are very similar.
Recent legal and political developments
In June 2021, the Danish government published a report, “Towards a better society with tech giants”, which covers the following issues: democratic control of Big Tech’s business models, support for the democratic conversation (e.g. avoiding echo chambers on social media), responsibility for user content presented on platforms, fair compensation for Danish content creators (music on YouTube and online advertising are mentioned), workers’ rights, protection of children and financial contributions to Danish society (digital tax). The report recognises the inherent dilemma between asking platforms for more moderation of illegal content and protecting freedom of expression. Going forward, the Danish government foresees a dialogue with the tech industry to address these issues and notes, with a quote from Mark Zuckerberg, that the tech industry itself has asked for clear regulation to balance the dilemmas of privacy, competition, free speech and safety.
Two months later, in August 2021, the report was followed by an action plan with a number of planned legislative initiatives, one of which clearly overlaps with the DSA proposal. A national law proposal, which is expected to be presented to Parliament in February 2022, will require large social media services to remove manifestly illegal content within 24 hours and other illegal content within 7 days, similar to the German NetzDG Act. The proposal will also contain a redress mechanism for content moderation decisions by social media platforms. In addition, there will be a social media cooperation forum where representatives of the largest social media companies and Danish authorities can discuss issues such as the spread of illegal and harmful content. This part of the proposal seems to be inspired by the EU Internet Forum.
The action plan does not consider whether these measures will be compatible with the DSA, which is expected to regulate notice and action procedures in Article 14 (with or without fixed timelines for removal of illegal content) as well as redress options for users against content moderation decisions. However, it is apparent that the Danish government plans to regulate social media services through national law in combination with the DSA, a strategy that is bound to create legal challenges and conflicts with the European Commission, since EU law takes precedence over national law. The emphasis placed on the country-of-origin principle in the Danish government’s DSA position is completely absent from the government’s plans to regulate large social media services with national law.
This contribution is part of an independent research project on mapping the position of and situation in the Member States with respect to the Digital Services Act (DSA), for which the DSA Observatory has received a (partial) financial contribution from our partner in this project, EDRi.