DSA Audits: Procedural Rules Leave Some Uncertainties

Anna Morandini

(PhD candidate at the European University Institute)

 

Auditors hold significant power in reviewing the DSA compliance of very large online platforms and search engines (‘VLOPs’ and ‘VLOSEs’). Their role includes evaluating platform policies on systemic risk management. In October 2023, the Commission adopted the delegated regulation on the performance of DSA audits (‘DRA’) to specify audit procedures. Unless the Parliament or the Council raises objections, it will enter into force by 20 January and apply to the first round of audits due in August 2024.

This post outlines the DRA and discusses its potential to render audits a central platform accountability tool. It first introduces audits under the DSA and the DRA’s role, followed by reflections on auditors and audit criteria. Before discussing auditing methods, it highlights auditors’ strong access to information. Finally, it turns to the audit report and the ways to access it. The conclusion points to the long way that DSA audit practice still has ahead in shaping audit criteria and accessibility. To succeed, audits must dispel salient concerns of audit washing, namely the fear that they remain a mere procedural exercise without substantive impact.

DSA Audits and the Delegated Regulation

Article 37 DSA foresees yearly audits for VLOPs and VLOSEs, i.e. services the Commission designates once they reach at least 45 million average monthly active recipients in the EU. VLOPs and VLOSEs must commission independent auditors to assess their compliance with the transparency and safety obligations of Chapter III DSA. These audits moreover review compliance with commitments under future codes of conduct and crisis protocols established under Articles 45-46 and 48 DSA.

The DRA specifies who can audit, how providers must cooperate, and how to conduct the audit. Its annexes establish compulsory templates for audit and audit implementation reports. The Commission held a public consultation on the draft DRA in May 2023, which received 44 comments – significantly fewer than the simultaneous call for evidence on data access. Yet the broad scope of DSA audits should call our attention to the DRA: auditors hold an influential position in the new EU platform accountability framework.

Who Audits?

The DSA establishes quality requirements for auditors, which the DRA does not further specify. Auditors must be independent from the audited provider: in particular, they must not have performed audits for the same provider for more than ten consecutive years, must not render non-audit services to it within a year before and after the audit, and their fees must not be conditional on the audit’s result. They must hold expertise in risk management and technical competence, as well as proven objectivity and professional ethics (demonstrated particularly by adherence to codes of practice and standards).

While some argue that an independent body could best identify competent auditors, Article 37 DSA relies on VLOPs and VLOSEs to do so. In practice, one question stands out: Which organisations hold the expertise needed to audit very large providers, combining knowledge of platform accountability with experience in large-scale auditing? In the DRA Explanatory Memorandum, the Commission asks auditors to build the necessary capacities and providers to select auditors accordingly.

The DRA formally opens audits to diverse actors. For example, it refrains from mandating existing accounting standards (see proposals). Many still expect the Big Four to dominate DSA audits (see Wikimedia). Some stakeholders, however, consider human rights consultancies or law firms best positioned to assess value choices in algorithmic systems (see TikTok). Opening audits to smaller, non-profit, or open-source-driven experts in algorithmic accountability and human rights would require proactive steps (see Mozilla). Many civil society actors instead hope for strong data access and resources for adversarial audits outside the scope of Article 37 DSA (see AlgorithmWatch and AI Forensics, Article 19, and Mozilla).

Defining Audit Criteria

Audit criteria are the standards against which auditors assess compliance. Controversially, the DRA leaves their definition to auditors. Its vision is to create a range of suitable approaches by providing ‘sufficient flexibility for the development of good practices’ (DRA, Explanatory Memorandum). This approach could encourage auditors to consider a service’s nature and to ‘offer it a fair and specific critique’ (see demands by Wikimedia). Article 44(e) DSA enables the Commission to later counter inadequate or incomparable audit criteria by promoting voluntary European or international standards. Existing suggestions form a starting point.

Many stakeholders criticised that the DRA tasks auditors with defining audit criteria. Large auditors argued that this is not commonly their role. They fear a lack of comparability between audit reports, a weakening of their independence, and a race to the bottom in audit criteria. They pushed for the criteria to be elaborated by the Commission with stakeholder participation, by an independent body, or, as an interim solution, by the providers themselves. The chosen openness in criteria allows flexibility while increasing the difficulty and weight of auditors’ choices. It may create path dependencies, with early auditing practice shaping later standards.

Audited Information

Under Article 5 DRA, providers must give auditors all necessary information and access – particularly regarding IT systems, data processing, and algorithmic systems. The provision moreover covers decision-making structures, compliance measures taken for each obligation, and the benchmarks the provider uses to analyse compliance. Auditors can further demand access to IT systems, testing environments, staff and premises, personal data, information on processes, and documentation. To conduct their analysis and tests, they can ask providers for resources, assistance, and explanations.

This strong access to information, systems, and staff provides a sound basis for audits – and significant opportunities for platform accountability. It is only rivalled by Commission access (Article 69(2)(d) DSA); researcher data access is more limited. Tensions might arise from confidentiality interests. For example, Google argues that, as a data controller under the GDPR, it should determine the necessity and proportionality of disclosing personal data.

Selecting Audit Evidence

The DRA requires auditors to consider information gathered from the provider, Board reports, and Commission guidelines. This particularly comprises all relevant risk assessment, risk mitigation, and transparency reports by the provider. Auditors may further consider information published by vetted researchers and reports by other providers (Article 13 DRA). The DRA did not follow demands to involve civil society organisations more formally, be it by including their analyses as a source of information or by mandating a consultation period in which interested journalists, NGOs, and researchers could submit information that auditors must then consider (see Wagner).

If they rely on sampling, auditors must select information that is representative of the audit period and of relevant algorithmic features such as personalisation, and that reflects concerns related to minors, minorities, or other groups. More generally, the information auditors use must be relevant and reliable (Articles 11-12 DRA).

Defining Auditing Methods

While tests of algorithmic systems are not mandatory in principle, audits will likely include them. When selecting their methodology, auditors decide whether such tests are a necessary addition to substantive analytical procedures and the analysis of internal controls. They must conduct tests of algorithms if reasonable doubts about compliance arise from the audit. Such doubts may stem from auditors’ critical judgement, indications in Board reports or Commission guidelines, or events occurring during the audit, e.g., crisis situations. Following input from the public consultation, the DRA moreover obliges auditors to identify material errors or omissions in providers’ public transparency reports (Article 10 DRA).

Auditors should attain a reasonable level of assurance, the highest auditing standard, and must justify it if they cannot (Article 8(8) DRA). They must describe their initial methodology and any changes made to it in the audit report (Article 10(2) DRA). As Algorithm Audit points out, there are normative choices involved in both designing and assessing algorithmic systems, which audit reports should make visible.

Methodological Minimum Standards

Specific methodological minimum standards govern the auditing of risk assessments, risk mitigation measures, crisis response mechanisms, and codes of conduct and crisis protocols (Articles 13-17 DRA). The overview below shows the standards for risk assessments.

 

Analysis of risk assessments must include:

- Appropriate identification of risks (incl. regional and linguistic aspects)
- Method of assessment of each risk (incl. probability and severity)
- Identification and weighting of Article 34(2) DSA risk factors (e.g., algorithmic design)
- Collection and use of information (incl. scientific and technical insights)
- Testing of risk assumptions with the specific groups most impacted
- Performance within the Article 34(1) DSA timeframe
- Identification of functionalities with likely critical impact on risks

 

When auditing risk mitigation measures, auditors must assess how providers identified the measures and whether they appropriately applied those suggested in Article 35(1) DSA. They review whether the adopted measures mitigate risks in a reasonable, proportionate, and effective way – with particular regard to fundamental rights. This analysis assesses how the measures were designed and executed and compares risk assessments before and after their adoption.

Composing Audit Opinions

The overall audit opinion follows from the audit conclusions, in which auditors assess compliance with each of the audited obligations. The audit opinion separates compliance with DSA Chapter III from compliance with each code of conduct or crisis protocol. Both conclusions and opinions can take three forms: positive, positive with comments, or negative. A single negative audit conclusion results in a negative audit opinion, as the DRA did not follow calls for more nuanced conclusions or a higher threshold for a negative audit opinion.

 

Audit conclusions:

- Positive: the provider complied.
- Positive with comments: the provider complied, and the auditor comments on the provider’s benchmarks or recommends improvements without substantive effect on the conclusion.
- Negative: the provider has not complied.

Audit opinion:

- Positive: positive conclusions for all audited obligations/commitments.
- Positive with comments: no negative audit conclusion and at least one conclusion with comments.
- Negative: one or more negative conclusion(s).

 

Auditors have to complete the first audit report within a year of the date on which the obligation starts to apply to the provider. For the 19 initially designated VLOPs and VLOSEs, this deadline is 25 August 2024. Providers and potential auditors have raised concerns about their ability to live up to the complex standards in the first round of audits.

Implementing and Publishing Reports

Within a month of receiving an audit report that is not ‘positive’, providers must outline measures to implement the auditor’s recommendations in an audit implementation report. Article 37(6) DSA obliges them to justify any decision not to follow a recommendation and to set out alternative measures instead. This system could create feedback loops between auditors and providers that incentivise continuous improvements. Where the Commission finds that the provider has not complied with the audit’s recommendations, it may adopt non-compliance decisions and impose fines (Chapter IV DSA).

VLOPs and VLOSEs must publish the audit report and the audit implementation report within three months of receiving the former from the auditor. The public can therefore expect to read the first reports by December 2024. Providers may, however, remove information out of concern that it would disclose confidential provider or recipient information, threaten the security of the service or public security, or harm recipients. In that case, they must provide the Commission and the Digital Services Coordinator with the unredacted reports and a statement of reasons (Article 42 DSA). Balancing secrecy interests and transparency demands when interpreting this broad and contested exemption from publication is a central question for practice. A potential path to allow independent scrutiny of redacted information is to open it to vetted researchers.

Conclusions: Building Upon Practice

With comprehensive access to information and IT systems, the DRA gives auditors significant power. Yet it only hesitantly fulfils its task of specifying audit procedures, leaving central choices to VLOPs and VLOSEs in selecting their auditors, to auditors in defining criteria and methods, and to the Commission in ensuring compliance and public scrutiny. Auditors’ choices in reviewing compliance with the broad risk management obligations are particularly crucial. This practice will determine whether audits become the envisioned strong tool of platform accountability or rather materialise the risks of audit washing.

The DRA builds on the vision of creating expertise over time, initially prioritising flexibility. The Commission concluded from the consultations that, while stakeholders long for clarity on the auditing process and comparability of results, ‘a too detailed and prescriptive approach to how the audit should be conducted would be premature’ (DRA, Explanatory Memorandum). It aims to establish comparable audit criteria based on auditors’ early practice, relying on its competence to promote voluntary standards developed by European and international standardisation bodies. To open this process to public input and scrutiny, the Commission should involve civil society and researchers in further discussions on the auditing process – particularly in the initial auditing rounds. Further analysis should track how this approach evolves.