The General Approach of the Council on the Digital Services Act

Ilaria Buri and Joris van Hoboken


On 25 November 2021, the Council (Competitiveness) meeting approved its negotiating position (general approach text) on the DSA. Far from turning the Commission’s proposal upside down, the Council’s amendments clarify and expand certain (limited) aspects of the proposal, but do not fundamentally alter the content and structure of the original draft provisions. The most significant changes, as explained below, relate to competences under the supervision and enforcement chapter, where the European Commission is granted exclusive oversight powers vis-à-vis dominant platforms. Search engines are included in the Council’s general approach to the DSA, with new specific references in the caching safe harbor provision and as potential Very Large Online Search Engines (VLOSE) in the dominant platform category.

This blogpost provides a brief overview of the most salient amendments in the Council’s DSA general approach text. As the European Parliament’s IMCO Committee is scheduled to adopt its final report on the DSA by mid-December, the positions of the co-legislators will soon be negotiated in the trilogue meetings scheduled to take place in 2022, after the plenary vote in the European Parliament planned for early that year.

What is absent in the Council’s text?

The Council’s compromise text steers clear of intervening significantly on a number of aspects that emerged as the most hotly debated in the almost three thousand amendments tabled in the European Parliament and in the wider debate. For instance, on the question of online ads (and of possible new restrictions on tracking-based ads), the Council’s text leaves the Commission’s proposal on (mere) transparency standards unchanged, adding in a new recital (50a) that “common and legitimate advertising practices that are in compliance with Union law should not in themselves be regarded as constituting dark patterns”.

The Council’s provision on systemic risk assessment, identical to the Commission’s, is also less ambitious than the one presented in the IMCO Committee draft report, as it leaves untouched the limited scope of Article 26, which is the cornerstone of the oversight mechanism for big tech and the systemic risks connected to their services. Relatedly, the Council’s provision on audits does not include any amendment to ensure that auditors are truly independent from platforms, while the rules on access to data for researchers (discussed below) may be too limiting and vague to deliver effective scrutiny. These are examples of choices that weaken the systemic risk mitigation architecture, and with it the idea of achieving meaningful and effective levels of VLOP accountability.

Contrary to the IMCO draft report, which introduced strict short deadlines for the removal of different categories of illegal content, the Council’s proposal confirms the EC’s original approach, which refrained from introducing timeframes for removal. This policy choice might reflect concerns among some Member States about the excessive impact of such deadlines on freedom of expression, a concern voiced by the French Constitutional Council back in 2020 when it found the “Avia law” unconstitutional.

The Council text keeps the focus of the proposal on combating illegal content, in accordance with the principle that “what is illegal offline should also be illegal online”. The moderation of harmful ‒ but legal ‒ content is left to the terms and conditions of providers of intermediary services, which for VLOPs fall within the scope of the risk assessment obligations.

Online search engines and online marketplaces

The Council text now explicitly clarifies that the DSA applies to online search engines, which become a stand-alone category of intermediary services. Specifically, the liability exemption is granted to online search engines under the same conditions provided for caching services under Article 4. In addition, VLOSE with more than 45 million active users are subject to the same obligations as VLOPs. The need to clarify the position of search engines in EU intermediary liability law has been discussed for more than a decade. The choice of a caching safe harbor is notable, and raises fewer issues from a freedom of expression perspective than imposing the same standards as for hosting companies. The inclusion of search engines in the chapter for dominant platforms, with risk management, special transparency and data access obligations, is a choice for asymmetric regulation in the search engine area.

The Council’s draft also introduces a new section on provisions applicable to providers of online marketplaces, which includes the rules on traceability of traders (Article 22 in the original EC proposal) and two new provisions imposing additional obligations on marketplaces. The new Article 24b on “compliance by design” requires providers to design and organize their interfaces in a way that facilitates compliance with pre-contractual and product safety obligations, and to avoid design choices that might adversely affect users’ autonomy and decisions. When becoming aware of the presence of an illegal product or service on their platforms, online marketplaces must inform the users who bought such items, providing them with the identity of the trader and the applicable means of redress (new Article 24c).

Country-of-origin principle and (more centralized) enforcement chapter

Over the past months, the country-of-origin principle emerged as one of the most politically contentious aspects of the DSA negotiations among Council members. France, in particular, called for a revision of this principle and advocated that the authorities of the country of destination be granted specific powers of intervention and enforcement. The French initiative, backed by other large Member States, triggered the reaction of a coalition of ten other countries, led by Ireland, in defense of the country-of-origin principle as set out under the e-Commerce Directive. Around the time of the European Council meeting of October 2021, a group of countries, led by France, started supporting the idea of centralized enforcement towards VLOPs, with exclusive powers attributed to the Commission.

The Council’s general approach on the DSA reflects these recent political developments. On the one hand, the Council’s proposal preserves the country-of-origin principle, thus attributing exclusive powers to the Digital Services Coordinator (DSC) of establishment for the supervision of intermediaries which do not qualify as VLOPs or VLOSE. On the other hand, the Council’s text now grants the Commission exclusive powers (Article 44a) for the supervision and enforcement of VLOPs and VLOSE with regard to the obligations established under Section IV of Chapter III, i.e. those relating to the management of systemic risks and the special transparency and data access obligations (Articles 25 through 33 of the original EC proposal).

When it comes to obligations other than those set out under Section IV of Chapter III – such as the ones on notice and takedown, terms and conditions and trusted flaggers – the Commission and the Member State of establishment will have shared powers for supervision and enforcement vis-à-vis VLOPs and VLOSE. Specifically, the Member State of establishment will be competent to exercise its supervision and enforcement powers “to the extent that the Commission has not initiated a proceeding in relation to an alleged infringement of the same obligation”. The Council’s text also requires the Commission and the DSCs to provide each other with mutual assistance, particularly as regards the exchange of information. Relatedly, the DSC of establishment will be obliged to inform all DSCs of destination, the Board and the Commission of its intention to open an investigation into, and take a final decision on, a specific intermediary (Article 44b).

Joint investigations

The Council’s version also expands the Commission’s proposal on joint investigations (Article 46). Specifically, the DSC of establishment can launch joint investigations on its own initiative, or upon recommendation of the Board acting on the request of at least three DSCs, to investigate an alleged infringement of the Regulation in the Member State concerned, with the participation of the competent DSCs. In any case, any DSC can request to be admitted to a joint investigation if it can demonstrate a legitimate interest. The views of all participating DSCs must be taken into account in the preliminary position eventually issued by the DSC of establishment, and in case of substantial disagreement with that preliminary position, the matter can be referred to the Commission. Moreover, a DSC of destination joining the investigation will be able to exercise its investigative powers with regard to information and intermediaries’ premises located on its territory.

Enforcement and cooperation in investigations

DSCs suspecting that a VLOP or VLOSE has infringed the provisions on systemic risks (Chapter III, Section 4), or has systematically infringed any of the provisions of the Regulation, may invite the Commission to examine the question by sending a request through the information sharing system of Article 67 (Article 50). The Commission, however, is not obliged to start a proceeding, as the opening of an investigation against a suspected VLOP or VLOSE rests within its discretion (Article 51). Following the conclusion of an investigation, the Board must provide its opinion on the Commission’s preliminary findings, which must be taken “into utmost account” by the Commission in its final decision.

Enhanced supervision

Moreover, the Council’s text introduces a system of enhanced supervision for the remedies aimed at addressing infringements of the obligations concerning the identification and mitigation of systemic risks (Article 59a). Where the Commission has adopted a non-compliance decision, including with the imposition of fines, the VLOP or VLOSE concerned is required to draw up an action plan to terminate or remedy the infringement. Again, the Commission must take “into utmost account” the opinion of the Board as to whether the action plan is appropriate to achieve its purposes. A new Article in the Council’s text also grants the European Court of Justice, in accordance with Article 261 TFEU, jurisdiction to review the Commission’s decisions on fines or periodic penalties.

Final considerations on the DSA enforcement chapter

The DSA approach to enforcement vis-à-vis VLOPs, both in the initial proposal and in the Council compromise text, appears informed by the (far from optimal) experience of the EU with the enforcement of the GDPR, where the country-of-origin principle enabled the current situation of stalemate at the level of the Irish data protection authority.

The Council’s choice of the Commission as the sole regulator for VLOPs (and VLOSE), at least for the most complex provisions, i.e. systemic risk mitigation, has the potential to address some of the problematic consequences of the country-of-origin principle for effective enforcement. At the same time, it cannot be ignored that the Commission is not itself an independent regulatory authority. From this perspective, the Council’s compromise choice raises issues of regulatory legitimacy and potential interference with fundamental rights, which do not yet seem to have been explored at length in the public debate.

It has been pointed out that the matters regulated by the DSA are closely related to the areas of data protection and media law, two domains where independent regulators are the norm. From this perspective, establishing an independent EU regulator to act vis-à-vis VLOPs on matters covered by the DSA would seem a more appropriate solution. Should the Council’s enforcement framework gain momentum in the coming negotiations, additional questions that need to be explored and clarified include how, practically, the Commission plans to take on this role (how many officials will be assigned to the oversight function?) and how it will interact with existing regulators (on matters related to the DSA, such as data protection, (audiovisual) media, competition, telecommunications and consumer law) at the national and EU level.

Mitigation of systemic risks

The EC’s provision on systemic risk assessment (Article 26), which is the cornerstone of the systemic risk management approach devised by the Commission for VLOPs, is left untouched in the Council’s compromise text. However, Article 27 on risk mitigation now includes three new possible mitigation measures. These may consist of adapting content moderation processes (particularly for illegal hate speech), adopting awareness-raising measures and implementing targeted measures to protect children (such as age verification, parental control tools and tools for minors to flag abuse or request help).

Furthermore, under the Council’s Article 32, VLOPs are required to establish a compliance function, independent from their operational functions and composed of one or more compliance officers (including a head of compliance). The compliance function must, in particular, ensure the proper identification of systemic risks and the adoption of the necessary mitigating measures. Strategies relating to such risks must be reviewed at least yearly by the management body, which oversees and is accountable for the governance decisions concerning the independence of the compliance function.

Data access and scrutiny

The Council’s Article 31 amends the rules concerning the conditions for access to data by vetted researchers in a way that makes the access and scrutiny regime, overall, less favorable to researchers compared to the original proposal.

First, the access regime under the Council’s Article 31 continues to exclude journalists and NGOs, whose inclusion (as potential vetted researchers) has been strongly advocated over the past months. On the one hand, access to data is no longer limited (as in the EC’s text) to academic researchers, but is extended to any researcher affiliated with a research organization as defined under the Copyright Directive. However, as the primary goal of the relevant research institutions must be research or education, the scope of the Council provision, in terms of the subjects deemed eligible for access, does not diverge much from that of the EC.

The day after the approval of the Council’s text, a group of international academics, NGOs and think tanks sent an open letter to the members of the EP IMCO Committee, expressing their concerns about the DSA’s shortcomings in creating the conditions for effective scrutiny of the systemic risks connected to the operations and systems of big platforms, and for their accountability. As Article 31 is an important innovation in the DSA’s oversight structure, the letter urges MEPs to preserve the IMCO amendment (from the draft opinion of last May) which extends data access and scrutiny of VLOPs to vetted public interest civil society organisations and journalists. Moreover, the signatories invite MEPs to remove the broad trade secrets exemption in Article 31(6)(b), which currently constitutes a legitimate ground for platforms to refuse data access to researchers. A similar warning was recently raised by Facebook whistleblower Frances Haugen, who called upon the co-legislators to reconsider such a broad (and dangerous) exemption to transparency and accountability.

Moreover, while under the EC proposal VLOPs would be obliged to provide access to data to vetted researchers “upon a reasoned request from the DSC of establishment or the Commission”, in the Council’s text the final decision to award researchers the status of “vetted researchers” lies solely with the DSC of establishment. This can be expected to leave the future Irish DSC overloaded with such applications, which might remain pending for a long time, as the DSA draft does not include any deadline for the regulator to decide on them.

In order to be vetted and gain access, researchers must “demonstrate” that they meet all of the conditions listed under Article 31(4). While the addition of the word “demonstrate” suggests that a higher (and unspecified) burden of proof is imposed on researchers, the Council’s text also expands and further details the conditions that researchers must prove they meet. In particular, the application must detail the technical and organizational measures adopted to ensure the security and confidentiality of the data. Researchers must also justify “the necessity and proportionality for the purposes of their research of the data requested”, and they must “demonstrate” that the expected results of the research will contribute to the analysis of systemic risks and mitigation measures. Access by a vetted researcher can be terminated where the DSC that awarded the status decides – after an investigation conducted on its own initiative or based on information from third parties – that the vetting requirements are no longer met. Vetted researchers are in this case allowed to “react” to the findings of the investigation, but the text does not seem to grant them a real right of defense.

Crucially, in addition to the above, the rules on the access regime for researchers retain a very wide exception for the protection of trade secrets, which can be invoked by platforms as a valid reason to oppose an access request and demand its modification. As observed above, access to data and scrutiny are crucial to the effectiveness of the risk assessment and mitigation mechanism set out by the DSA, and an essential complement to the audit mechanism under Article 28, particularly when researchers’ findings diverge from or add to those of the auditors. A research access regime like the one envisaged under the Council compromise text is unlikely to enable meaningful levels of effective scrutiny. This, combined with provisions on risk assessment and audits that leave the EC proposal unchanged – with a limited scope for the identification of relevant systemic risks, and no guarantee that auditors are truly independent – gives the impression that the Council’s ambition to tackle systemic risks is relatively limited.

Orders to act against illegal content; notification of suspicion of criminal offences

Under the Council’s version of Articles 8 and 9, intermediaries are required to inform the user “of the order received and the effect given to it”, including at least the statement of reasons and redress possibilities (and, under Article 8, the territorial scope of the order). The text also specifies that the authorities issuing the orders must transmit to the competent DSC the order and the follow-up information received from the intermediaries.

As to the obligation to notify suspicions of criminal offences, which was limited to online platforms under the original EC proposal (Article 21), the Council’s text now extends it to all hosting services. However, no clarification is added as to which criminal offences “involving a threat to the life or safety of a person or persons” give rise to the notification obligation.

Internal complaint handling system and out-of-court dispute settlement

The compromise text extends to the individuals and entities submitting notices the possibility of challenging a content moderation decision before the online platforms’ internal complaint-handling system and before an out-of-court dispute settlement body. Decisions which can affect users, and can therefore be challenged under the DSA rules, now include, under the Council compromise text, “any restrictions of the visibility of specific items of information” provided by the users (Article 15 on statement of reasons). Moreover, the text clarifies that platforms’ decisions under Article 17(1) can always be directly challenged before a court in accordance with the applicable laws.


The Council’s amendments to the DSA proposal do not revolutionize the Commission’s original text. The provisions commented on above elaborate on or restructure some aspects of the original proposal, but do not call the Commission’s initial draft into question. This may reflect both agreement with the approach of the European Commission and a strong political will to adopt the DSA relatively quickly, in view of the opportunity for the EU to set an international standard with its new approach to platform regulation and content moderation.

The IMCO final compromise text, which is scheduled for adoption in the coming weeks, will offer some hints on the possible direction of the negotiations on some of the most crucial issues covered by the DSA, including its enforcement and oversight structure. However, the IMCO text will be subject to a plenary vote in early 2022, and amendments other than the compromise ones adopted in the IMCO report might be tabled again. What can already be said, in any case, is that the trilogue negotiations will see the Council and the Commission particularly close in their positions, a circumstance which might make it more difficult for the Parliament to get some of its (possibly) more ambitious ideas into the final DSA text.


Ilaria Buri is a research fellow at the Institute for Information Law, University of Amsterdam, where her work focuses on the “DSA Observatory” project. She previously worked as a researcher at the KU Leuven Centre for IT and IP Law (CiTiP) on matters of data protection and cybersecurity. She is admitted to the Bar in Italy and, prior to joining academia, she gained extensive experience as a practitioner in law firms and worked at the European Commission (DG Climate Action).

Joris van Hoboken is a Professor of Law at the Vrije Universiteit Brussel and an Associate Professor at the Institute for Information Law, University of Amsterdam. He works on the intersection of fundamental rights protection (privacy, freedom of expression, non-discrimination) and the regulation of platforms and internet services. At the VUB, he is appointed to the Chair ‘Fundamental Rights and Digital Transformation’, established at the Interdisciplinary Research Group on Law Science Technology & Society, with the support of Microsoft.