Report on workshop – Online journalism: Digital Services Act and European Media Freedom Act, 23 February 2023

On 23 February 2023, the DSA Observatory and the AI, Media and Democracy Lab organised an online workshop on online journalism and the role of online platforms. Several expert speakers discussed the Digital Services Act, (Article 17 of) the European Media Freedom Act (EMFA), and the safety of journalists and protection of news media content online. Afterwards, a discussion followed with an invited expert audience.

A report of the workshop can be found below, and in PDF here: Report workshop on DSA, EMFA and online journalism – 23 February 2023


Report on the workshop

Online journalism: Digital Services Act & European Media Freedom Act

“The DSA’s capacities for free media and safe journalists online: realistic or utopian?”

Held 23 February 2023, 15:30 – 17:00 (CET).


Introduction

This report summarises the discussion held at the workshop jointly organised by the DSA Observatory and the AI, Media and Democracy Lab on Thursday, 23 February 2023, on the implications of the Digital Services Act (DSA) and the proposal for a European Media Freedom Act (EMFA) for online journalism. The workshop consisted of two parts: the first focused on the safety of journalists online and the role of online platforms in that safety; the second on the protection and availability of news media content online. Six experts discussed specific aspects of these topics in relation to the DSA and the EMFA, addressing questions such as: how can we use the tools in the DSA and the EMFA to create an environment that is favourable to media pluralism and resilient journalism?

The expert interventions were followed by a discussion with an invited expert audience of representatives from academia, civil society, policy, industry and the journalism community. A main goal of the workshop was to bring together different societal stakeholders to discuss the implications that the DSA and the EMFA will have for the protection of (news) media content online and for the safety of journalists.

Part I – Safety of journalists

The workshop began with a discussion of journalists' experiences with online harassment and of coping with violent threats and death threats. Online harassment has become part of the lives of many journalists, especially women journalists. In addition, violence that starts online can continue to have effects offline. Questions that arose during the discussion revolved around how to deal with and effectively tackle these threats: taking threats offline through content moderation initiatives makes them 'invisible', but journalists' safety may still not be secured. Reporting harassment to the authorities is therefore valuable, but much more is needed. As discussed during the workshop, police responses and judicial procedures are inadequate and bureaucratic, and fail to provide effective remedies and protection for journalists confronted with online harassment. Although the problem of harassment of journalists, in particular of women journalists and other media actors who identify as part of minority and/or marginalised communities in society, has been broadly documented and reported on in the news, there is no formal solution yet.

The urgency of finding solutions is underlined by the fact that the safety of journalists matters not only to media actors themselves: journalists fulfil an extremely important role in our society as the public watchdogs of our democracies. The discussion continued by focusing on the Council of Europe's perspective on the safety of journalists, given their important societal position, and on how Council of Europe standards may inform the operationalisation of the DSA.

In the European legal context, Article 10 of the European Convention on Human Rights (ECHR) plays a central role, together with the case law of the European Court of Human Rights (ECtHR). Firstly, Council of Europe member states have a positive obligation to ensure a safe and favourable environment in which everyone can engage without fear, online and offline. Journalism is an open profession, and a variety of actors can engage in journalistic activities. All of those media actors should be able to do their job in a safe environment.

As platforms are widely used, on a large scale, for the harassment of journalists, a multi-stakeholder approach is needed to find solutions to the problems described above. The ECHR and the ECtHR's case law cannot (solely) be relied upon for solutions, as the Court is not a political body. In addition, the ECHR applies to Council of Europe member states, but not to private actors such as platforms. In that regard, some Recommendations adopted by the Committee of Ministers of the Council of Europe might prove useful, such as Recommendation CM/Rec(2016)4 on the protection of journalism and safety of journalists and other media actors and Recommendation CM/Rec(2018)2 on the roles and responsibilities of internet intermediaries. Although these Recommendations are not legally binding, they are standard-setting and show the willingness of member states to invest in effective systems of protection at national level.

One of the questions that surfaced was how the Council of Europe's principles (the ECHR, case law and principles in policy instruments) can be put into practice. In the Dutch context, PersVeilig is an initiative by journalist organisations and law enforcement to strengthen journalists' position and safety. Ideally, a similar protocol would be developed that focuses specifically on online harassment, alongside the one already in place in the Netherlands for physical harassment.

The discussion continued on how the Council of Europe framework surrounding journalists' safety could be translated into EU legislation to solidify the multi-stakeholder approach, and more specifically on how this framework can be operationalised in the DSA. The first panel concluded with a discussion of a possible plan by the Dutch government to use some of the DSA's mechanisms to address threats against journalists online. In June 2022, the Dutch government announced a plan to explore the use of trusted flaggers to improve the safety of journalists. This initiative is commendable and a potential (partial) solution to the problem of harassment of journalists, but, as was underlined at the beginning of the discussion, taking threats and harassment offline is only part of a solution, as it responds to the problem rather than preventing it. Future resilient policy and legislation should be evidence-based and should differentiate between different sorts of online threats and harassment, so as to provide granular solutions. This may have implications for the use of certain mechanisms in the DSA, such as orders to act against illegal content (Article 9) or the notice and action mechanism (Article 16). The categorisation of threats according to whether they fall within the scope of (national) criminal or civil law might be of particular interest. The recent proposal for a Directive on combating violence against women and domestic violence seems relevant here, as it seeks, among other goals, to harmonise cyber violence legislation. The proposal also explicitly refers to the DSA: it aims to make the DSA's due diligence obligations more effective 'by including minimum rules for offences of cyber violence', and it criminalises certain forms of cyber violence, on the basis of which national judicial authorities would be able to issue orders to platforms to act on those illegal types of content (Article 9 of the DSA).

Part II – Protection and availability of news media content online

The workshop continued with a discussion of the protection and availability of news media content online and a description of the current problems. Although substantial data on media content with which platforms have interfered (e.g., through removal, suspension or termination of accounts) may be lacking, there are notable incidents of European news media content being the subject of platform interference and of European journalists being denied access to online platforms. Such examples demonstrate how interfering with the availability of (news) media content can deprive the public of access to quality news and diverse media content. They show why regulatory solutions are necessary to tackle arbitrary decisions by platforms and to protect fundamental rights.

It also makes clear that, up until now, platforms have been able to moderate content on the basis of their terms and conditions in a non-transparent manner, and that platform decisions on media content have been a black box. In this regard, the protection of online content and fundamental rights were at the core of the DSA negotiations. Although Article 14 of the DSA (on platforms' terms and conditions) has also been criticised, the provision could in particular contribute to the protection of online media content. Article 14 requires platforms to apply their terms and conditions in a transparent and non-discriminatory manner and with due regard for fundamental rights such as the right to freedom of expression and the freedom and pluralism of the media. In addition, the DSA contains many reporting and transparency obligations, such as Articles 15, 24 and 42. These transparency obligations could also help to further substantiate the problem of media content availability online.

The workshop then turned to a topic closely related to the availability of media content online: the so-called media exemption. A (form of) media exemption can create a privileged position on the basis of which media can (to some extent) influence platform decisions on their content. The rationale would be that media content deserves a different status from 'regular' content because of its important added value to open public debate. The media exemption did not make it into the DSA, but the debate has returned with the recently proposed European Media Freedom Act (EMFA), and in particular Article 17 of the EMFA. Under Article 17, Very Large Online Platforms (VLOPs) 'shall take all possible measures' to communicate a statement of reasons when they decide to suspend the provision of their service to a self-declared media service provider on the grounds that the provider's content is incompatible with the VLOP's terms and conditions (but does not pose a systemic risk as referred to in the DSA). Although the discussion has mainly revolved around the media's potentially privileged role being problematic, there are also concerns that the provision might give even more power to platforms, as Article 17 would require heavy content moderation technology.

In addition, several participants stressed the media exemption's risk of facilitating the dissemination of disinformation. If media service providers are able to self-declare as such, platforms might be incentivised to leave their content alone in order to comply with their DSA obligations to have due regard for freedom of expression and media freedom and pluralism, which could open the door to disseminating disinformation under the guise of 'media content'.

It was stressed that these difficulties should also be taken into account when implementing rules on media content online and when implementing the DSA. Although media protection online is necessary, it was argued during the discussion that Article 17 of the EMFA might not be the best way to address unjust removals of media content. This was supported by the fact that the DSA already contains several measures to protect media content online, which would be an argument as to why Article 17 of the EMFA is not 'necessary'. Examples of such (procedural) DSA measures are the provisions on terms and conditions, including users' right to be informed of significant changes to the terms and conditions (Article 14), the internal complaint-handling system (Article 20) and the out-of-court dispute settlement system (Article 21). While the systemic risk mitigation framework laid down in Articles 34, 35 and 37 does not provide procedural avenues for journalists, it might nevertheless benefit online journalism and the availability of news media content: Article 34(1)(b) lists 'any actual or foreseeable negative effects' on freedom of expression and information, 'including the freedom and pluralism of the media', as a potential systemic risk.

Concluding remarks

The workshop concluded by returning to the question posed at its beginning: how can we use the tools in the DSA and the EMFA to create an environment that is favourable to pluralism and resilient journalism? In this regard, the discussion reinforced that the DSA does have a number of tools that may go some way towards protecting the safety of journalists online and protecting news media content. However, considerable work remains before February 2024, when the DSA comes into full effect: national regulatory authorities (Digital Services Coordinators) need to be established and mechanisms such as trusted flaggers implemented before we can properly assess how the specific provisions in the DSA will actually operate in practice for journalists. Further, the considerable debate around Article 17 EMFA and news media content is only going to become more intense as the legislative proposal progresses. Finally, the DSA Observatory and the AI, Media and Democracy Lab will continue to facilitate debate and discussion around the DSA, the EMFA, media freedom and journalism, and welcome proposals for future workshops and discussions on these topics.


Sources referred to during workshop

Reports

Legislation

  • European Convention on Human Rights (ECHR) – Council of Europe
  • Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC (Digital Services Act) – European Union
  • Proposal for a Regulation of the European Parliament and of the Council establishing a common framework for media services in the internal market (European Media Freedom Act), 16 September 2022 – European Union
  • Proposal for a Directive of the European Parliament and of the Council on combating violence against women and domestic violence, 8 March 2022 – European Union.

Policy documents

  • CM/Rec(2016)4 – Recommendation on the protection of journalism and safety of journalists and other media actors
  • CM/Rec(2018)2 – Recommendation on the roles and responsibilities of internet intermediaries

Other

  • Journalism Trust Initiative – a trust indicator project with a code of conduct on self-assessment by media outlets, mentioned in Recital 33 of the EMFA in the context of self-assessment by media service providers (Article 17(1) EMFA).