The DSA’s most demanding rules are directed at the largest platforms: Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs). This blog post offers a quick overview of two of the most important obligations for these large services (those with more than 45 million average monthly active users): risk management and independent auditing. In particular, it discusses what kind of information the public can expect to receive about these processes, and the timeline for its release.
What can we expect from risk assessments and audits?
Risk management is the cornerstone of the DSA’s ruleset for large services. VLOPs and VLOSEs are required to conduct periodic assessments of so-called ‘systemic risks’ (Article 34) and then take appropriate measures to mitigate these risks (Article 35). The concept of ‘systemic risks’ is a novel and broad category, extending from fundamental rights and the spread of unlawful content to the protection of civic discourse and public health. The DSA also offers a long list of factors to take into account when assessing risks (e.g. recommender systems, content moderation systems, advertising systems, and ‘data related practices’) and an even longer list of possible mitigation measures (e.g. adapting UX design, terms and conditions, content moderation processes, recommender algorithms, awareness raising, and so on).
Due to its broad scope, the actual substantive norms following from risk management remain difficult to predict. They will also vary between different services, depending on the specific risks they face. The framework is already fueling debate as to appropriate methods. In any case, its broad scope suggests that the risk management framework can be a powerful lever for the European Commission to oversee platform policy. And precisely because risk management is so open-ended, we need transparency to keep track of its outcomes – more on this below.
As for the independent audits, these are supposed to verify compliance with the risk management framework and other duties under the DSA. The audits are to be carried out by independent third parties, who gain privileged and far-reaching access to the service’s data in order to verify the claims the platform makes in the risk management process. On this basis, the auditors draw up an audit report, including an opinion (‘positive’, ‘positive with comments’ or ‘negative’) on the service’s compliance.
Whether risk management and auditing will do much good remains up for debate. Certainly they will create a lot of busywork for platforms and auditors. Criticism from scholars like Alessandro Mantelero has highlighted how unspecific the underlying norms are, and how much interpretive leeway they leave platforms in terms of assessment methods. And since the system revolves around self-assessment, it remains to be seen whether auditors will have the capacity and incentives to meaningfully scrutinise platform findings, much less whether the Commission and the public will be able to assess their performance. More on this below.
What can the public expect to see?
There are two main sources of information about risk management and auditing: bi-annual reports from the VLOPs and VLOSEs themselves, and annual comprehensive reports from the European Board for Digital Services.
VLOPs and VLOSEs are required to publish a report for each risk assessment and audit they conduct. This report must contain the underlying audit report, and also set out the results of the platform’s own risk assessment and mitigation processes. Just how much insight these will offer is unclear, however, since platforms can choose to redact certain information from this public report, if it “might result in the disclosure of confidential information of that provider or of the recipients of the service, cause significant vulnerabilities for the security of its service, [sic] undermine public security or harm recipients” (Article 42(5)). Only the Commission will then have access to the full, unredacted version.
Article 42(5) DSA: Public reporting or public redacting?
In light of these rather broad carveouts, it remains to be seen how useful these public reports will be in practice. The first clause, which exempts ‘confidential information’, is especially problematic. Isn’t the very purpose of transparency rules to gain access to information that would otherwise be confidential? If the information were not confidential (i.e. already a matter of public record), why would it need to be reported in the first place? If this exception is interpreted too broadly, it could give platforms the opportunity to scrub their public reports of much if not all valuable information. Hopefully the Commission will police these carveouts strictly, and, to this end, develop a more demanding, normative interpretation of what can and should count as ‘confidential information’.
Another important source of information about risk management will be a yearly report published by the European Board for Digital Services, in cooperation with the European Commission. This report must document the “most prominent and recurrent” systemic risks reported by VLOPs and VLOSEs, and best practices to mitigate these risks (Article 35(2) DSA). The Commission can also issue its own guidelines recommending certain best practices for risk mitigation (Article 35(3)), offering another glimpse into the risk management process. In addition to these public reporting rules, the DSA’s data access rules for vetted researchers (Article 40) are also relevant, since they are intended to enable research related to systemic risks. But data access is another beast entirely, and in this blog post we’ll limit our discussion to proactive public reporting duties.
When can we expect risk assessments and audits?
For risk assessments, the timer starts running when the European Commission first designates a service as a VLOP or VLOSE (Article 33). The rules for VLOPs/VLOSEs become applicable four months after designation (Article 33(6)), and the first risk assessment must be completed within those same four months (Article 34(1)). Once the first risk assessment is completed, the service has another year to conduct its first independent audit. Finally, the first public report must be issued within three months of the audit.
In sum, it’s the Commission’s designation decision which kicks everything into motion: if you know the date of designation, all the other deadlines follow. So when might the Commission start designating VLOPs and VLOSEs? The earliest possible date permitted by the DSA is 17 February 2023. If the Commission moves quickly, we could therefore expect the first risk assessments by 17 June 2023, the first audits by 17 June 2024, and the first audit reports by 17 September 2024. Next month we’ll know more: the longer it takes the Commission to designate VLOPs and VLOSEs after 17 February 2023, the longer we’ll have to wait for these other milestones too.
| | First designations | First risk assessment | First independent audit | First public report |
|---|---|---|---|---|
| Rule | No earlier than 17 February 2023 | Four months after designation | Sixteen months after designation | Three months after audit |
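Since each deadline is just calendar arithmetic counted from the designation date, the timeline can be sketched in a few lines of Python. This is only an illustration of the reading set out above (the function and key names are ours, not from the DSA):

```python
from datetime import date

def add_months(d: date, months: int) -> date:
    """Shift a date forward by whole calendar months, keeping the day of month.

    Assumes the day of month exists in the target month (true for the 17th).
    """
    month_index = d.month - 1 + months
    return date(d.year + month_index // 12, month_index % 12 + 1, d.day)

def dsa_milestones(designation: date) -> dict:
    """Illustrative milestone dates following the timeline described above."""
    # Rules apply, and first risk assessment is due, four months after designation.
    first_risk_assessment = add_months(designation, 4)
    # First independent audit within a year of the first risk assessment.
    first_audit = add_months(first_risk_assessment, 12)
    # First public report within three months of the audit.
    first_public_report = add_months(first_audit, 3)
    return {
        "first_risk_assessment": first_risk_assessment,
        "first_audit": first_audit,
        "first_public_report": first_public_report,
    }

# Earliest possible designation date under the DSA: 17 February 2023.
print(dsa_milestones(date(2023, 2, 17)))
```

Running this with 17 February 2023 as the designation date reproduces the dates in the table: risk assessments by 17 June 2023, audits by 17 June 2024, and public reports by 17 September 2024.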
As for the Board’s comprehensive report on systemic risks, there is no specific deadline other than that it must appear ‘once a year’ (Article 35(2) DSA). However, the Board can only start working on such a report once all Digital Services Coordinators (DSCs) have been designated by the Member States in February 2024. This means that the Board’s first report could arrive as late as February 2025. A similar timeline applies to the first researcher data access applications, which can only proceed once the competent DSC is established.
In sum, it could be a while before any big news drops on systemic risk management under the DSA. For those keeping count: from 17 February onwards, all eyes will be on the Commission to see how quickly it moves on designating VLOPs and VLOSEs, the official kick-off for everything to come.