The extraterritorial implications of the Digital Services Act

Laureline Lemoine & Mathias Vermeulen (AWO)

 

As enforcement of the Digital Services Act (DSA) gathers speed, a number of non-EU based civil society and research organisations have wondered to what extent the DSA can have an impact on their work. This blog post provides a concise overview of the areas and provisions of the Digital Services Act that are most pertinent to the question of extraterritorial application of the Regulation.

General principle

The DSA – like the GDPR – explicitly provides for extraterritorial application, as the scope of the Regulation covers “recipients of the service that have their place of establishment or are located in the Union, irrespective of where the providers of those intermediary services have their place of establishment” (Article 2). The DSA therefore applies to companies outside the EU as long as they have users in the European Union.

Territoriality issues related to content moderation obligations

In order to escape liability for illegal content under the DSA, Article 6 of the DSA requires hosting providers to act “expeditiously to remove or to disable access” to such content once they become aware of it. Because illegal content is not defined in the DSA and its definition may vary between EU Member States, the Commission explains that where content is illegal only in a given Member State, as a general rule it should only be removed in the territory where it is illegal[1]. It can therefore be assumed that content will only be removed, or access to it disabled, within the territory of the EU.

This approach would be consistent with the application of the General Data Protection Regulation (GDPR) and, for example, with the judgment in Google v Commission nationale de l’informatique et des libertés (CNIL) of 24 September 2019, in which the Court of Justice of the European Union (CJEU) held that Google does not have to remove search engine results worldwide in order to comply with a ‘right to be forgotten’ request under EU data protection law, effectively limiting the territorial scope of the EU right to de-referencing.

However, Member States might be tempted to take a different approach. In the defamation case of Glawischnig-Piesczek v Facebook of 3 October 2019, the CJEU held that the E-Commerce Directive (2000/31/EC) does not preclude national courts from ordering the removal or blocking of illegal online content worldwide, imposing no territorial limitation on such orders.

Complaints

In terms of enforcement and redress, Article 53 of the DSA only allows recipients of the service who are located or established in a Member State to lodge complaints.

Similar to the GDPR, there is no EU nationality requirement to lodge a complaint. Moreover, the term “location” does not imply a legal status in the way that “residence” or “establishment” would; in that sense, a temporary presence in the EU should also be covered. Such users could, however, be represented by bodies, organisations or associations from outside of the EU.

It should be noted that although Article 53 allows users to complain about any infringement of the DSA, individuals are not conferred rights of action under the DSA, unlike under the GDPR. The complaint mechanism is therefore a way to alert the national Digital Services Coordinators to any infringements. As recital 118 explains: “complaints could provide a faithful overview of concerns related to a particular intermediary service provider’s compliance and could also inform the Digital Services Coordinator of any more cross-cutting issues”.

Different ways for non-EU civil society organisations to get involved in the DSA

There are a number of ways in which non-EU based organisations could become active in the enforcement of the DSA.

  1. Representation: non-EU stakeholders can represent service recipients established or located in the EU in order to lodge a complaint with a Digital Services Coordinator. Concretely, an organisation based in Israel would be able to file a complaint on behalf of Jewish citizens in the EU, but an organisation based in Berlin that represents the Rohingya in Bangladesh would not be able to file a complaint on their behalf.
  2. Vetted researchers: non-EU stakeholders could apply to become vetted researchers (provided they meet the criteria of Article 40.8).
  3. Auditors: non-EU stakeholders could be appointed as an auditor by a Very Large Online Platform (VLOP) (provided they meet the criteria of Article 37.3).
  4. Risk assessment process: non-EU stakeholders could be part of the risk assessment process at the invitation of a VLOP (provided they are ‘impacted’ by the relevant issues or have expertise, as per recital 90).
  5. Codes of conduct: non-EU stakeholders could be involved in drawing up codes of conduct (provided they have relevant expertise, as per Articles 45.2 and 46).
  6. Crisis protocols: non-EU stakeholders could be involved in drawing up crisis protocols (provided they have relevant expertise, as per Article 48.3).
  7. Independent expert or auditor: non-EU stakeholders could be required to provide information to Digital Services Coordinators (DSCs) (Article 51.1) or the Commission (Article 67), which would then allow them to submit written comments to the DSC (Article 51.3) or the Commission (Article 72) in relation to the case at hand.

Providing evidence to regulators

One particularly interesting option for non-EU based organisations is the ability to provide evidence, through formal (Article 53 DSA) or informal channels, relating to the risk assessment and mitigation obligations of VLOPs.

Non-EU stakeholders can intervene ex ante, by building up evidence and cases to incentivise VLOPs and regulators to pay attention to particular risks that should be covered in a risk assessment. They could also intervene ex post, once risk assessments are published, to show regulators that certain risks should have been included in the assessments or that the measures taken have not been sufficient to mitigate the risks assessed.

The extraterritorial elements of a risk can relate to either its origin or its impact. As long as the risks can be linked to people located in the EU, there is nothing in the text of the DSA which suggests that risks originating from outside of the EU should not be covered in a risk assessment. For instance, an organisation can make the case that a VLOP should assess to what extent an election in a non-EU country, or an organised harassment campaign, can have negative effects on users in the EU, and should spell out how it plans to mitigate such risks. Information from non-EU countries can also be valuable in understanding and assessing the nature of risks faced in the EU.

However, the chance of such risks being included in the assessment will mostly depend on whether or not they reach the threshold to be considered ‘systemic’ risks. Because the link with the EU is more tenuous in those cases, it can be expected that VLOPs will only include those risks if their severity and probability are high. Such an assessment can consider, for instance, the number of people that are affected or at risk.

Indirect effect

Non-EU users and organisations could benefit indirectly from the implementation of some of the articles in the DSA. For instance, better formulated Terms of Service (Article 14) and better ‘explanations of algorithms’ (Articles 27 and 38) will improve users’ understanding around the world and will shine a light on what’s happening ‘under the hood’ of very large online platforms. The results of the DSA transparency requirements (Articles 15, 24 and 42 DSA) are also published for everyone to see.

Concluding remarks

Enforcement of the Digital Services Act is gaining momentum, and non-EU based civil society and research organisations can contribute to this process both directly and indirectly. While the DSA will probably have relatively little direct impact on the behaviour of platforms outside of the EU, its extraterritorial implications can contribute significantly to understanding and tackling systemic risks on very large platforms.

 

[1] https://ec.europa.eu/commission/presscorner/detail/en/QANDA_20_2348