The Digital Services Act and the implications for news media and journalistic content (Part 1)
By Doris Buijs, student researcher at DSA Observatory.
Introduction and background
During the legislative journey of the Digital Services Act (DSA), a major debate arose over the protection of news media and journalistic content online from “arbitrary practices” by online platforms. Established media organisations, such as the European Broadcasting Union (EBU), argued that when news media organisations and journalists put content on online platforms, it “regularly gets interfered with by those platforms”. In a well-known example, Facebook blocked and prevented the sharing of a Guardian article showing prisoners in Australia in chains, because of Facebook’s rules on nudity. Despite the debate, a “media exemption”, in the form of media receiving prior notice from platforms before content moderation decisions, did not make it into the final version of the DSA. It will be interesting to see how the Media Freedom Act turns out, as the recently published proposal does contain a similar provision: Article 17(2) states that Very Large Online Platforms (VLOPs) “shall take all possible measures” to “communicate to the media service provider concerned the statement of reasons accompanying that decision … prior to the suspension taking effect.” Regardless of what eventually becomes of the Media Freedom Act (it still needs to be adopted), the media exemption debate around the DSA stands for a larger issue: the protection of content from established news organisations and journalists online. Luckily, the debate has not been in vain, as several amendments were made to the DSA in favour of news media and journalistic content. News media organisations and journalists can invoke some procedural rights specifically aimed at protecting media freedom, or make use of general provisions that give all users of online platforms avenues to contest content moderation decisions made by online platforms. Platforms may also be required to promote media content in order to tackle disinformation under the DSA’s risk assessment framework.
Although platforms have some discretion in formulating effective risk mitigation measures, one of those measures could be promoting diverse content and news from authoritative sources and established media. This way, platforms could kill two birds with one stone: enhance trust in the media and respect freedom of expression and media freedom.
However, the Commission’s original DSA proposal mentioned neither news media nor journalists. A discussion arose, among other things, about a coveted media exemption: news media organisations, NGOs and politicians pleaded for a media exemption under which online platforms would be prevented from exercising control over editorial content. Indeed, on 15 January 2022, the week in which MEPs would vote on the final amendments to the DSA, amendment 511 was proposed by 46 MEPs. This amendment was meant to oblige platforms to give media organisations prior notice, by ensuring that media organisations are informed and “have the possibility to challenge any content moderation measure before its implementation” (emphasis added).
Supporters of the proposal argued that the media exemption would ensure that news media organisations, which abide by several EU and national regulations, as opposed to online platforms, remain editorially responsible. This way, media freedom would be ensured, and platforms would need to give prior notice to media organisations before taking down their content. Ilias Konteas, director at EMMA/ENPA, two organisations that represent Europe’s magazine media and newspaper publishers, described how without the media exemption “the boundaries of press freedom would no longer be defined by law, but by private companies”. Critical comments mostly came from organisations involved in tackling disinformation. According to them, the exemption would delay the process of tackling disinformation online: media organisations would be able to contest an upcoming decision to take down content containing disinformation, keeping the content online during the process of contestation. European Commissioner Věra Jourová even described the media exemption as “good intentions leading to hell”.
As the media exemption proposed by amendment 511 did not make it into the DSA, the purpose of this blog post is to ask how media content is protected under the DSA, and what the implications of the DSA are for news media and journalistic content. A second blog post will focus on how the DSA may contribute to the protection of the safety of journalists online.
Before jumping into the main issue, an important note on terminology, as the definition of “(news) media” and “journalism” is the topic of ongoing discussion and both terms are quite contested. Where reference is made to “journalists” or “journalism”, this blog post uses the definition of journalism given by the UN Human Rights Committee in General Comment no. 34: “Journalism is a function shared by a wide range of actors, including professional full-time reporters and analysts, as well as bloggers and others who engage in forms of self-publication in print, on the internet or elsewhere”. And where reference is made to content from “(news) media organisations”, the blog post refers to the much-debated notion of “media” as adopted by the Committee of Ministers of the Council of Europe in Recommendation CM/Rec(2011)7, which applies a set of criteria for identifying media, and which has been applied by the European Court of Human Rights (ECtHR) in its case law. For further exploration of these terms, see this study on media plurality and diversity online.
This blog post starts by discussing the DSA provisions that provide possible procedural rights specifically for news media and journalistic content. It then turns to some general DSA provisions that, although not specifically aimed at protecting (news) media and journalistic content, may nevertheless be helpful to news organisations and journalists.
Specific procedural protection of news media and journalistic content
Article 14: Terms and conditions
A first provision providing some possible protection is Article 14(4) DSA, under which platforms are required to have “due regard” to freedom of expression and media freedom under the EU Charter of Fundamental Rights when applying and enforcing restrictions on content under their terms and conditions. Further, under Recital 47, VLOPs must also have “due regard” to media freedom and media pluralism when “designing, applying and enforcing” their terms and conditions. This is important, as platforms used to be able to exercise considerable power and enjoyed quite some autonomy when drafting their terms and conditions, based on the freedom to conduct a business as laid down in Article 16 of the Charter. For example, when a platform decides to take down a news article, or to shut down a journalist’s account, it may now only do so under Article 14 DSA if it takes freedom of expression and media freedom into consideration. For journalists and the media, this can be regarded as a positive development. The very first DSA proposal did not mention media freedom or pluralism, so the obligation in Article 14 to pay due regard to media freedom and media pluralism can be seen as a compromise for the lack of a media exemption. Although there will be no prior notice to media organisations or journalists when platforms apply (restrictions based on) their terms and conditions, platforms should consider the media’s special position (although it remains unclear what having “due regard” means in practice). Nevertheless, there is now a legally binding fundamental rights assessment obligation for platforms when applying their terms and conditions.
The interpretation of fundamental rights possibly extends to the ECtHR’s – extensive – caselaw under Article 10 of the European Convention on Human Rights, as the CJEU has found that Article 11 of the Charter is to be given the “same meaning and the same scope” as Article 10 ECHR, including as “interpreted by the case-law of the European Court of Human Rights”.
Further, Article 14(1) obliges platforms to inform their users about any restrictions they may impose under their terms of service. This way, users in general, including journalists and the media, can to some extent predict what platforms will (not) allow and why. When significant changes are made to the terms of service, the platform has to inform users (Article 14(2)), so users do not have to regularly check platforms’ terms and conditions for major changes.
General procedural protection
Article 14 is in fact the only procedural provision that specifically mentions the media. The following Articles do not specifically mention the freedom and pluralism of the media, but they give all users of online platforms procedural rights. As journalists and media organisations can themselves be users of those platforms, they can make use of these provisions as well.
Article 17: Statement of reasons
Another provision that will strengthen the procedural rights of journalists whose content has been interfered with is Article 17 DSA. It prescribes that all online platforms (not only VLOPs) should provide a statement of reasons when imposing restrictions such as the removal or demotion of content, restrictions on the visibility of content, disabling access to content, or the suspension or termination of the service or of the user’s account. This is an opportunity for journalists to find out why their content has been removed or their account has been blocked, in case the general information in the terms and conditions does not suffice. For example, when content is removed because of a breach of the platform’s terms and conditions, the platform must specify on which specific contractual ground the action is based and why the content is incompatible with its terms and conditions. The same applies when the platform interferes with content because it is allegedly illegal: when stating its reasons, the platform should refer to the legal ground on which the content is considered illegal.
However, a platform’s obligation to state its reasons “shall apply at the latest from the date that the restriction is imposed, and regardless of why or how it was imposed” (emphasis added). In this short sentence lies the rationale for the media exemption as proposed in the earlier mentioned amendment 511, with its prior notice to (news) media organisations and journalists. The wording of Article 17(2) means that platforms can wait until the very last moment to inform users that they will moderate or take down content, or shut down an account completely. For regular users, this can be unfortunate, but not necessarily problematic in a broader sense. It is quite different for journalists and news reporters. The very essence of news is its timeliness; something that has news value today can be forgotten by tomorrow. The ECtHR underlined the value of news in Observer & Guardian v. the United Kingdom, emphasising that “news is a perishable commodity and to delay its publication, even for a short period, may well deprive it of all its value and interest.” Taking down media content for even the shortest amount of time can render a “news” article worthless: once it has been put back up on the platform after the platform’s decision was contested, it may no longer be news. Platforms should be aware that the longer they take to decide on a request to put an article back online, or to re-open an account, the more problematic it becomes for media organisations. Indeed, a media organisation’s reputation may be damaged when a news article is taken down by a platform and, after a complaint by the media organisation, put back online again. Users see articles popping up, being taken down, and put back up again. This does not help increase society’s trust in the media; in fact, it opens the door to suspicion.
This could be addressed by platforms putting up a correction or clarification, explaining why the platform acted the way it did, but also why it reversed its action. This way, media organisations or (individual) journalists do not have to defend their content; the platform explains it to its users. Although the DSA is not a media law but a broader law regulating online services, it unfortunately stays silent on this topic.
Article 20: Internal complaint-handling system
Importantly, Article 20 DSA requires platforms, with the exclusion of micro and small enterprises (Article 19(1)), to implement an internal complaint-handling system that enables users to lodge complaints. Those complaints can be directed either against decisions taken by the platform upon a notice submitted through the notice and action mechanism (Article 16), or against decisions based on the content allegedly being in breach of the platform’s terms and conditions (Article 14). Such a system should be free of charge (Article 20(1)). A complaint can be lodged within six months following the decision, starting on the day on which the user is informed of the platform’s decision (Article 20(2)). Article 20(2) refers to Article 16(5), which states that the provider shall “without undue delay, notify that individual or entity”, and to Article 17, to ensure that the user is actually aware of the platform’s decision. This obligation only applies when the relevant electronic contact details are known to the platform (Article 17(2)). It is not entirely clear how the platform should then notify and inform the media organisation, and whether informing a media organisation by definition means that the media organisation is in fact aware of the interference with its content or account. It is plausible that a media organisation regularly checks whether its content has been interfered with shortly after publication, but it will probably not keep checking months later, even though it is possible that, a few months after publication, a news item (wrongfully) gets labelled as disinformation by a trusted flagger.
Be that as it may, the right to make use of an internal complaint-handling system is nevertheless an important procedural right for journalists, as they are not forced to go to (an expensive) court to have a piece of content reinstated or an account re-opened.
Article 21: Out-of-court dispute settlement
Another procedural step that offers users an option before having to go to court is the out-of-court dispute settlement. Once a user has lodged a complaint that has not been resolved by the internal complaint-handling system, the user is entitled to select a certified out-of-court dispute settlement body to resolve the dispute relating to the contested decision (Article 21(1)), although the out-of-court dispute settlement body cannot impose a binding settlement (Article 21(2)). Just like trusted flaggers, the out-of-court settlement bodies shall be certified by the Digital Services Coordinator of the Member State in which the settlement body is established (Article 21(3)). Regardless of the right to an out-of-court dispute settlement body, users can contest a decision by a platform (again, micro and small enterprises excluded) at any stage by initiating proceedings before a court (Article 21(1)). One of the dispute settlement bodies could be something along the lines of the Facebook Oversight Board. Up until now, this has been a self-regulatory body providing users with the option to appeal a decision taken by Facebook. Journalists too can appeal decisions, such as the decision of 1 February 2022 in which the Oversight Board overturned Meta’s decision to remove a post by a Swedish journalist reporting on sexual violence against two minors. It is still a very open question, though, whether the Oversight Board would meet the requirements of the DSA (think of independence, or the fact that the Board can pick and choose the cases it will handle). You can read more about the Oversight Board, the establishment of out-of-court dispute settlement bodies and their possible deficiencies here.
Risk assessment applied to journalism
Apart from the procedural possibilities to protect media content, the EU legislators have also incorporated a risk governance system into the DSA. In fact, one of the key characteristics of the DSA is its risk-based approach. Specifically, VLOPs and Very Large Online Search Engines (VLOSEs) have to comply with Articles 34 (risk assessment), 35 (mitigation measures) and 37 (independent auditing).
Two of the risks mentioned in Article 34 are especially relevant to media content, as Article 34(1)(b) and (c) state that any actual or foreseeable negative effects on the exercise of fundamental rights, in particular freedom of expression and information (including freedom and pluralism of the media), and any foreseeable negative effects on civic discourse, are possible systemic risks. VLOPs and VLOSEs must therefore identify the risks stemming from their services and carry out risk assessments. In order to mitigate the identified systemic risks, VLOPs and VLOSEs are obliged to “put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks”.
One can imagine how wrongful takedowns of media content, or the suspension or termination of journalists’ accounts by VLOPs, can lead to negative effects on the exercise of freedom of expression, media freedom and media pluralism, especially if this happens on a larger scale. Such actions interfere both with the journalist’s individual right to express her/his/their opinion and with the public’s right to receive and impart information, as enshrined in Article 10 ECHR. A large number of takedown decisions by platforms could thus amount to a systemic risk within the meaning of Article 34(1)(b) or (c).
Recital 90 states that VLOPs should involve representatives of the groups potentially impacted by the systemic risks stemming from their services when conducting their risk assessments and designing their mitigation measures. VLOPs should also test the assumptions underlying their risk assessments “with the groups most impacted by the risks and the measures they take”. These could very well be journalists and/or media organisations, considering Article 34 and the systemic risks it identifies for media freedom, media pluralism and freedom of expression in general. It should be noted, however, that it remains rather vague who decides which groups are impacted, how “group” is defined, and when this will become clear. Leaving aside the time frame and the question of when we can expect platforms to take action to protect news media and journalistic content, the combination of Article 34 and Recital 90 could mean that news media and journalists, insofar as they are among the most impacted groups, should be involved in the risk assessments.
Tackling disinformation as a positive obligation
As platforms have a legal obligation to tackle negative effects on media pluralism, disinformation comes into play, as it can pose a significant danger to media pluralism. Disinformation has no clear legal definition in the DSA and does not necessarily qualify as illegal content as defined in the DSA. As some member states have introduced national legislation making (forms of) disinformation illegal, while others have not, the rules on disinformation for media content remain rather vague under the DSA. Perhaps it is better to focus on the positive obligation for platforms to tackle disinformation (insofar as this disinformation constitutes a systemic risk in the sense of Article 34) by promoting diverse content and authoritative journalism on matters of public interest. This is also stipulated in Commitment 18 and Measure 18.1 of the strengthened Code of Practice on Disinformation, and by Irene Khan, UN Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, in her report from April 2022: “The best way of countering disinformation is not through censorship or banning of outlets but through the promotion of free, independent and pluralistic media.” As such, a risk mitigation measure for tackling disinformation under Article 35 could be the promotion of media content and the adaptation of algorithmic systems to promote media freedom.
It will be interesting to see whether platforms will actually use their power to promote media freedom and pluralism via their algorithmic and recommender systems. The same goes for established media organisations and journalists: to what extent will the DSA provisions provide practical and effective tools to keep their content accessible online? Hopefully, more will be known once the first VLOPs have been designated, the first reports have been published, and the first case law has been analysed.