DSA Audits: What Are They Good For?

By John Albert, DSA Observatory (University of Amsterdam)

Of the nineteen service providers initially designated as VLOPSEs under the DSA, X’s first compliance audit stands apart. Its auditor, FTI Consulting, broke from industry peers by offering relatively critical opinions — including findings that were unfavourable to the platform on obligations under active Commission investigation. How did X respond? Rather than work toward implementing the auditor’s recommendations, X simply reshuffled the deck: it went out and hired a new auditor (BDO). The move raises a deeper question about what the DSA audit regime is actually doing, and how seriously the Commission treats audits as part of systemic-risk enforcement, given that the regime relies, in principle, on auditors to provide an additional, independent layer of scrutiny.

In late November 2025, Very Large Online Platforms and Search Engines (VLOPSEs) published their second full round of systemic risk assessments, audit reports, and audit-implementation reports under the Digital Services Act (DSA).

For DSA watchers, these reports offer a vast amount of new material to sift through, and the first real opportunity to see whether risk assessments have matured year-over-year, if at all, in line with formal audit recommendations or those put forward by civil society. 

Any detailed evaluation will take time. But even a cursory look shows that of the nineteen VLOPSEs that have now published two rounds of reports, only two — AliExpress and X — changed auditors. Everyone else opted for continuity. 

What to make of this? X’s choice is particularly noteworthy. Its initial auditor, FTI Consulting, authored a relatively critical audit report in the first cycle, even issuing negative opinions in areas under active investigation by the European Commission, including systemic risk management obligations under Articles 34 and 35. This broke from a norm that auditors EY, Deloitte, and KPMG — collectively responsible for the bulk of early DSA audits — were clearly trying to establish under the auspices of their industry coalition, the European Contact Group (ECG).

For context: under Article 37 of the DSA, independent audits are intended to support the Commission’s supervisory oversight of VLOPSEs by providing an external assessment of platforms’ compliance practices. While the Commission’s enforcement decisions — including the imposition of fines, like the recent one against X — rest on its own legal findings, audits can nonetheless contribute to the broader factual context in which compliance is assessed. 

Now, by replacing FTI in the second audit cycle, X appears to have tried hitting the reset button on an already fragile accountability mechanism. Its new auditor, BDO, is party to the ECG and followed the industry group’s guidance, withholding judgments on obligations where the Commission was investigating X for possible infringements.  

This development is a stark example of the procedural ambiguities underpinning the audit regime — a framework that is both seriously constrained by Big Four auditor logics and significantly shaped by the very platforms it is meant to scrutinize.  

What accountability value, then, can these audits realistically deliver when it comes to assessing platforms’ novel systemic risk management obligations under the DSA?  

What auditors actually established in the first cycle — and what they didn’t 

To contextualize X’s auditor switch, it makes sense to look back at how the first cycle of DSA audits played out. The first cycle revealed an audit culture heavily influenced by financial and ESG auditing norms practiced by so-called Big Four consultancy firms, of which three (EY, Deloitte, and KPMG) were responsible for 17 of the first 19 DSA audits.  

While the audit reports cover a wide range of issues that warrant deeper analysis (examples here and here), this section focuses on VLOPSEs’ risk management duties under Articles 34 and 35. These provisions sit at the heart of the DSA’s risk-based accountability framework, yet are among its most open-ended — creating considerable ambiguity around how compliance should be evaluated, particularly in these early stages of DSA implementation.  

In reviewing all nineteen first-cycle audit reports, it becomes clear that auditors largely dealt with this ambiguity by confining their assessments to procedural checks. They assessed whether platforms had indeed documented formal risk-assessment processes, whether identified risks were linked to stated mitigations, and whether internal policies and paperwork broadly aligned with the DSA’s formal requirements.

2024 audit reports: summary of audit opinions for Articles 34 and 35

The table above summarizes the first cycle of audit opinions under Articles 34 and 35, showing a majority of “positive” and “positive with comments” findings, with only one provider — X — receiving any negative opinions. There were also four disclaimers of opinion issued for providers under active investigation by the European Commission for potential infringement of systemic risk obligations.

Some variation in these conclusions reflects the nature of the provisions themselves. Article 34(3), for instance, is narrower in scope, requiring platforms to preserve supporting documents of their systemic risk assessments and make them available to regulators upon request. This article drew consistently positive results, as it is relatively easy to audit — although X’s auditor also diverged here, issuing “no conclusion” on the basis that it was unable to verify that the necessary documents were indeed retained (or even generated and indexed to begin with).

By contrast, Articles 34(1), 34(2), and 35(1) are far more open-ended, asking platforms to identify, assess, and mitigate systemic risks to fundamental rights, civic discourse, and public health, among others. This is a highly novel set of obligations with no settled benchmarks, effectively giving platforms broad discretion over how to establish and document their risk management procedures in accordance with the law.  

Auditors’ positive conclusions on Articles 34 and 35 

Looking at the first audit cycle, auditors were mostly quite generous when evaluating their clients’ compliance with systemic risk obligations. While some auditors did hint that platforms’ risk assessments were too ad hoc, or that they failed to demonstrate a coherent risk management framework, most nevertheless delivered “positive” or “positive with comments” findings.  

Where auditors did offer recommendations for improvement (under “positive with comments” conclusions), these were generally vague and procedural in nature — asking for clearer policies, documentation, or evidence of more formalized risk management processes.  

Take, for example, the requirement for platforms to not only carry out annual risk assessments under Article 34, but also to conduct one before deploying any functionality likely to have a “critical impact” on identified risks (TikTok’s withdrawal of its “TikTok Lite Rewards” programme from the EU shows what can happen when platforms overlook this procedural step).  

Speaking to this requirement, at least three first-year audits — AliExpress (KPMG), Booking.com (Deloitte), and Google Search (EY) — recommended adopting clearer internal policies for when to trigger such an off-cycle risk assessment. Rather than suggest that an off-cycle risk assessment should have occurred prior to launching a particular feature (which would require proposing a substantive threshold for “critical impact” on identified risks), auditors recommended clarifying policies and instituting training to guide personnel in knowing when to trigger one.

In practice, this approach leaves platforms largely responsible for defining the standards against which their own risk-management practices will be judged. It also speaks to how auditors understood their role within the structural constraints of the first year. In the absence of meaningful benchmarks, auditors were tasked with either making subjective judgements, or cautiously deferring judgement to the extent possible. Perhaps unsurprisingly, most chose caution.  

The ECG line: don’t opine if the Commission is investigating 

This cautious approach was reinforced by the European Contact Group (ECG), the industry coalition to which EY, Deloitte, KPMG, and X’s newest auditor, BDO, belong. The ECG issued guidance in a position paper: where the European Commission is already investigating a platform on a given obligation, auditors should, in many cases, refrain from issuing a conclusion on that same obligation. In the first audit cycle, this approach led to four platforms — AliExpress, Facebook, Instagram, and TikTok — receiving “Disclaimers of Opinion” for Articles 34 and 35.   

The stated justification for such disclaimers is that the Commission may hold relevant evidence the auditor has not seen, and that this information could be of material importance to the auditor’s assessment. The implicit rationale, however, is the perceived liability risk. Auditors worry that issuing a clear opinion for obligations under investigation could later clash with the Commission’s findings and expose them to professional or legal challenge.  

With this practice of disclaimers, auditors have arguably stepped back from their responsibility exactly where an additional layer of scrutiny is most needed. Daniel Holznagel has argued that the legal basis for such disclaimers of opinion is doubtful; even so, the Commission has yet to formally address the issue, apparently giving auditors a free pass to withhold judgement in such cases, at least for now.    

FTI’s X audit: An outlier in the first cycle 

Unlike its peers, FTI ignored the ECG line and still issued conclusions on obligations under investigation. The Commission had already opened formal proceedings against X in December 2023, citing possible violations of, among others, X’s systemic risk management obligations.

FTI not only issued conclusions despite the Commission’s active proceedings; it was the only auditor to provide negative conclusions on risk management obligations, taking X’s shortcomings as evidence of noncompliance with Articles 34(2) and 35(1).

In its negative judgments, FTI described a series of procedural failures by X. The audit flagged basic gaps in how X assessed the influence of its recommender systems, content moderation, advertising practices and data use on systemic risks — areas the DSA requires platforms to analyse explicitly (in one instance, X even claimed that a separate assessment of data-related risks existed, but failed to produce it for the auditor).  

X’s “Freedom of Speech, Not Reach” policy was also cited as problematic. X treated it solely as a mitigation measure, even though the policy (essentially allowing more borderline content to stay up while limiting its amplification) may equally be regarded as a risk driver. FTI expected X to assess both sides of that trade-off, which the company did not do. 

How did X respond? In its first audit implementation report, X simply refused to implement FTI’s recommendations. As a justification, X cited the Commission’s active investigation, ongoing regulatory uncertainty regarding the scope and interpretation of systemic risk management obligations, and the risk that FTI’s findings could ultimately conflict with the Commission’s decision.

It’s indeed possible that the Commission could end up interpreting X’s systemic risk obligations somewhat differently than FTI did. But this seems like a weak excuse for X to disregard the auditor’s findings altogether.  

And then X changed auditors 

Which brings us to the second-round reports. After declining to implement FTI’s recommendations on Articles 34 and 35, X simply replaced it with an auditor it could expect to withhold its opinions: BDO.

Granted, the DSA does not require auditor continuity, and other EU audit regimes even mandate periodic auditor rotation to guard against independence risks. But X’s switch is conspicuous, given that BDO is a member of the European Contact Group and predictably followed the ECG guidance: where the Commission is investigating X for possible breaches, BDO did not issue a conclusion.

Whatever FTI revealed in the first cycle is now frozen in time, supplanted by a second-round audit that tells us what we already knew — that the audit mechanism is apparently broken for any obligation under active investigation. And if that logic were to extend through subsequent appeals (a process that could take years), it could effectively nullify the instrument precisely in the cases where it is most needed. 

This raises clear questions about how the audit framework operates in practice. Do FTI’s negative findings still inform the EC’s investigation into X? Does the Commission accept that where it has initiated enforcement proceedings, auditors won’t issue opinions? And if an inconvenient auditor can simply be replaced by a more deferential one, has the DSA already fallen victim to audit capture? 

X’s auditor switch certainly suggests an audit ecosystem under Article 37 that, at least in its early days, is overly cautious and beholden to the platforms it is designed to scrutinize. Whether audits become a meaningful source of regulatory insight, or just another routinised compliance product, will depend on how the Commission interprets moments like this one. 

Zooming out on systemic risk enforcement  

Since the DSA came into force, many have expected X to be at the center of the DSA’s first big enforcement moment. That moment finally came on 5 December 2025, when the Commission issued a €120 million fine against X for breaching transparency obligations that had been under formal investigation since 2023. But the Commission has yet to come down on X for possible infringements of its systemic risk management obligations, which remain under investigation.  

As my DSA Observatory colleagues Magdalena Jóźwiak and Joris van Hoboken have recently written about and discussed, the politics of systemic risk enforcement are tricky, to put it mildly. Despite many active investigations, enforcement remains slow and cautious given the complexity of the framework and the clear expectation of blowback from some US tech companies and the Trump administration.  

Even a sanction that penalizes a platform on more basic procedural grounds, as the recent X fine does, risks being turned into ammunition by tech oligarchs and Trump officials as part of a “censorship grievance fantasy” that underpins both recent US foreign policy decisions and parts of its official new national security strategy.

In this political climate, one can assume that the Commission will give platforms some time to figure out how to demonstrate more robust systemic risk management processes. For their part, enforcement teams continue to carry out investigations, develop risk mitigation guidelines, and document best practices to help guide companies in their compliance journeys —  while holding open the possibility of escalating enforcement actions that will, no doubt, be highly contested when they come.  

What are DSA audits good for?  

Given the broader legal and political context surrounding systemic risks, what value are DSA audits adding to the equation? As X’s auditor switch exemplifies, not very much — at least, not currently. Several researchers have already produced valuable insights and critiques of the first audit cycle, exposing gaps in the reports and suggesting improvements for subsequent rounds. But as these analyses suggest, the initial bar has been set quite low, even for audits of transparency obligations that are far less layered than those related to systemic risk management.  

One consistent research finding: it was hard to parse auditors’ methodologies and the evidence they relied upon, given how vague these disclosures generally are. Such opacity should make observers wary, coming from an audit industry with a long track record of scandal. Just last month, X’s new auditor BDO was itself fined £6.5m by the UK regulator for faking audit evidence over multiple years. The case has nothing to do with X or the DSA, but points to the systemic vulnerabilities within the audit industry that the DSA regime is now leaning on. 

This doesn’t necessarily mean DSA audits are without value. Even in these early stages, they can provide some baseline of visibility into platforms’ compliance practices. But by reducing novel and socio-technical challenges to a checklist of documentation and internal controls, even a more critical audit, like FTI’s in the first cycle, risks entrenching procedural metrics at the expense of substantive accountability.  

As it stands, the audit mechanism is spotty. It appears too easy to game, too reluctant to offer sharp judgement, and may soon grow too dependent on the very companies it is meant to help scrutinize. Until the Commission clarifies expectations for audits of Articles 34 and 35, audits will likely continue to produce very little value when it comes to systemic risks — precisely the domain where independent scrutiny was expected to matter most.