The European Parliament IMCO Committee’s compromise text on the DSA
(Institute for Information Law, IViR, University of Amsterdam)
Disclaimer: Dear reader, please note that this commentary was published before the DSA was finalised and is therefore based on an outdated version of the DSA draft proposal. The DSA’s final text, which can be found here, differs in numerous ways, including a revised numbering for many of its articles.
On 14 December 2021, the Internal Market and Consumer Protection (IMCO) Committee of the European Parliament adopted its final report on the Digital Services Act (DSA) proposal. The report – together with additional amendments tabled over the past weeks – will be voted upon by the European Parliament in a plenary session scheduled for 19-20 January 2022.
The IMCO’s compromise text approved on 14 December 2021 is the result of one year of intense debates among political groups, which resulted in the tabling of almost three thousand amendments. The plenary vote this week will unveil the European Parliament’s final negotiating position. It will then become clear, over the coming weeks, which innovations of the IMCO Committee’s text survive the trilogue talks, where the Council’s general approach is closer to the Commission proposal, and where new lines of compromise will be drawn.
This blogpost briefly summarizes the most significant amendments in the IMCO report to the original DSA proposal, presented by the European Commission exactly one year earlier. It will soon be updated with new analysis of the European Parliament’s final DSA text.
Chapter on intermediary liability
The IMCO compromise text (Article 7) expressly prevents Member States from limiting the offer of end-to-end encrypted services and the anonymous use of intermediaries’ services. Also, intermediaries cannot be obliged to “generally and indiscriminately” retain users’ personal data, as any specific data retention must be justified by a judicial order.
Significant limitations are introduced to the territorial scope of orders to act against illegal content issued by judicial or administrative authorities. Such orders will only be effective in the Member State where the order is issued, unless the content is illegal under EU law or the rights at issue require wider protection (Article 8(2)(b)). Furthermore, the EP’s text introduces the right to an effective legal remedy for users whose content was removed or whose information was requested under Articles 8 and 9 respectively. These remedies, to be exercised before a judicial authority in the issuing Member State, may include restoration of content erroneously considered illegal by intermediaries (Article 9a).
Due diligence obligations
The IMCO’s version of Article 12 on terms and conditions adds significant further detail to the obligations imposed on intermediaries with regard to their services’ contractual terms. In particular, intermediaries are required to operate “in a fair, transparent, coherent, diligent, timely, non-arbitrary, non-discriminatory and proportionate manner” when applying possible restrictions on users’ content. Moreover, VLOPs are obliged to make their contractual terms available in all the languages of the Member States where their services are offered.
On one of the most heated questions of the entire DSA debate – that of reining in the adtech industry and restricting the surveillance-based business model of platforms, which has been linked to the same systemic harms that the DSA aims to tackle – the final IMCO compromise text is far less ambitious than the three DSA resolutions the European Parliament approved in October 2020. Back then, the EP expressed the intention to introduce stricter restrictions on tracking-based ads, consisting of “a phase out, leading to a prohibition” of this type of ads, as later also advocated, among others, by the EDPS and the EDPB in their opinions on the DSA.
MEPs and civil society organizations which campaigned for a ban on tracking-based ads did not achieve their aims with this IMCO final report. However, they found support for a new provision (Article 13a) on online interface design and the prohibition of dark patterns. Specifically, intermediaries are prevented from using their online interfaces in a way that impairs users’ ability to make “a free, autonomous and informed decision or choice”. In particular, intermediaries should not give visual prominence to a specific consent option over others, and should refrain from persistently asking users for consent to data processing when consent has already been denied and from making the termination of a service more difficult than signing up for it.
The IMCO’s text introduces some additional procedural rules in the section applicable to hosting services. Information flagged as illegal under the notice and action mechanism must remain online while the assessment of the content is pending. At the same time, the anonymity of the notifier must, in principle, be guaranteed vis-à-vis the user who submitted the content (Article 14).
Like the Council’s general approach text, the IMCO’s version of Article 15 on statement of reasons extends the scope of the providers’ decisions on content (demoting and other measures, in addition to removing and disabling access) which give rise to an obligation to inform the users about the decision and its motivation.
Furthermore, the scope of the obligation to notify suspicions of serious criminal offences is extended to all intermediaries and is no longer limited to online platforms (Article 15(a)).
As to the additional obligations applicable to online platforms, the exclusion from those rules – foreseen for micro and small enterprises in the Commission’s proposal – is extended to intermediaries which qualify as not-for-profit organizations or as medium-sized enterprises, upon successful application for a waiver. The Commission can grant a partial or total waiver from the obligations set out in that section where the applicant does not pose “significant systemic risks” and has “limited exposure to illegal content”.
Systemic risks management obligations for VLOPs
The EP’s text introduces significant amendments to the original proposal’s section on additional obligations for VLOPs. As regards the identification of VLOPs, the IMCO’s Article 25 specifies that the calculation of active users must be conducted for each service individually. This has the potential to reduce the number of VLOP services falling under the systemic risk management obligations.
The EP’s IMCO text significantly strengthens the overall systemic risk management structure applicable to VLOPs. A risk assessment is now required before the launch of any new service, and the assessment must consider the design, algorithms and business model choices of specific platform services. Moreover, platforms must take into account language- and region-specific risks.
The list of systemic risks which must be assessed is broadened as follows:
- the dissemination of content in breach of platforms’ terms and conditions (in addition to illegal content) is included among these risks;
- any “actual and foreseeable” negative effects on fundamental rights become relevant as potential systemic risks, including consumer protection, human dignity, private and family life, personal data protection, freedom of expression and media pluralism, prohibition of discrimination, gender equality and the rights of the child. Notably, the EC’s and Council’s texts limited the risk assessment to the potential negative effects on only four specific fundamental rights: private and family life, freedom of expression, prohibition of discrimination and the rights of the child;
- risks which are “inherent to the intended operation of the service” also become relevant to the assessment, including their negative effects on vulnerable groups of users, democratic values, media freedom and freedom of expression;
- another category of systemic risks is added, concerning public health and other negative effects for the “physical, mental, social and financial well-being” of persons.
The IMCO’s text further articulates the requirements under Article 26(2), mandating VLOPs to assess how the abovementioned systemic risks are affected by their terms and conditions, community standards and the data collection and profiling underlying their advertising systems. “Where appropriate”, VLOPs must conduct the risk assessment and design their risk mitigation measures by involving representatives of users and of groups potentially impacted by the service, civil society organizations and independent experts. Possible mitigation measures may also specifically target VLOPs’ algorithmic systems, advertising systems and design.
The IMCO report tries to address some of the most striking limits of the original proposal’s independent audit requirements. Auditors must be vetted by the Commission and, crucially, they must be legally and financially independent from, and not in a situation of conflict of interest with, any VLOP. Furthermore, auditors must not have rendered any service to the VLOP concerned in the previous 12 months and are banned from working for that VLOP or a related association for 12 months after leaving the auditing firm. The audit report must also account for any elements which could not be audited and explain why they could not be covered.
The IMCO’s text takes a step forward on the requirements for VLOPs’ recommender systems. VLOPs are obliged to offer at least one recommender system option not based on profiling and to allow users to modify their preferences through an easily accessible interface (Article 29).
Access to data under Article 31 is extended, beyond vetted researchers, to vetted not-for-profit organizations representing the public interest. The EP text eliminates the explicit reference to trade secrets as a possible ground for VLOPs to request a modification of a research access request. However, as trade secrets qualify as confidential information, and vulnerabilities of confidential information allow VLOPs to withhold information, VLOPs would still be able to invoke trade secrets as a ground to oppose a request and require its amendment. Moreover, the national regulator (Digital Services Coordinator) and the Commission must publish a yearly report on the number of research access requests made to them and on how many were rejected, including as a result of a VLOP’s demand to amend a request.
Enforcement
As regards enforcement, and particularly enhanced supervision vis-à-vis VLOPs, the EP’s text makes it mandatory – rather than discretionary – for the Commission to start proceedings against dominant platforms suspected of having infringed any provision of the DSA, or found to have infringed the provisions on the additional obligations applicable to VLOPs (Article 51(1)).
Ilaria Buri is a research fellow at the Institute for Information Law, University of Amsterdam, where her work focuses on the “DSA Observatory” project. She previously worked as a researcher at the KU Leuven Centre for IT and IP Law (CiTiP) on matters of data protection and cybersecurity. She is admitted to the Bar in Italy and, prior to joining academia, she gained extensive experience as a practitioner in law firms and worked at the European Commission (DG Climate Action).