The DSA’s first shadow banning case

by Paddy Leerssen, postdoctoral researcher at the University of Amsterdam

6 August 2024



This post discusses a recent decision from the Amsterdam District Court, in which an end-user of X was awarded damages due to the platform’s undisclosed ‘shadow banning’ of their profile.

 


 

An early victory for user rights: on 5 July 2024, the Amsterdam District Court issued what is, to my knowledge, the first ruling on ‘shadow banning’ and the duty to state reasons under Article 17 DSA, as well as the first award of damages under the DSA. The decision is only available in Dutch, but this post offers a summary and analysis for an international audience.

The facts

The case was brought by Amsterdam-based legal academic and privacy activist Danny Mekić. Mekić is an active user of X with over 20,000 followers (@dannymekic) who frequently comments on public policy. In the fall of 2023, he posted a tweet criticising the EU’s proposed CSAM Regulation for disregarding user privacy. He then got word from other users on X that they were having difficulty finding his account; it appeared to have been delisted from X’s search suggestion function. But no communication was forthcoming from X, in an apparent breach of Article 17 DSA’s duty to notify and explain moderation actions to the affected user.

In the following days, Mekić sent X several emails seeking clarification. A month later, X responded with a generic policy statement (i.e. ‘X has automated mechanisms to analyze posts which may be associated with Child Sexual Exploitation (CSE), and subsequently may restrict those posts’ reach. This can cause individual posts associated with an account to be surfaced with temporary account-level restriction…’) – neither confirming nor denying the moderation status of Mekić’s account specifically. Finally, in January 2024, Mekić received a letter from Twitter’s lawyers confirming that his posts about the CSAM Regulation had triggered a visibility restriction and that this restriction had been reversed following an internal review.

The proceedings

Mekić brought his case to the Amsterdam District Court, representing himself via the European Small Claims Procedure. Here he sought, in short: (1) a ruling that X was in breach of contract and had acted unlawfully due to violations of the DSA, (2) an order requiring X to appoint a point of contact pursuant to Article 12 DSA, (3) compensation for the non-performance (wanprestatie) of X’s Premium service, estimated at $1.87, and (4) an order requiring X to comply with its obligations under Article 17 DSA by notifying and explaining the moderation actions taken against Mekić, and to comply with the service agreement by reversing his account restrictions.

All claims were granted except the fourth (in short, because Mekić had already received an explanation in the course of the proceedings and the erroneous delisting had already been reversed). In a separate proceeding, Mekić also sued X on data protection grounds for failing to comply with a data subject access request. The court has ruled on that claim as well, but the decision has yet to be published.

A preliminary issue in this dispute is the scope of the European Small Claims Procedure, which only allows for claims of less than €5,000. The damages claim of $1.87 falls well below this threshold, but X argued that Mekić’s non-monetary claims (the establishment of a contact point, the justification of the search delisting) involve costs that far exceed this limit. The court did not accept this argument. In this commentary, I will leave these procedural aspects aside and focus on the merits of the case.

Discussion

Most litigation from end-users against platforms in the Netherlands and other EU countries has focused on the question of wrongful moderation and on claims to put back or reinstate moderated content or suspended accounts. This case is special in the sense that X readily admits that its action was erroneous and that reinstatement has already occurred.

Instead, this case hinges on the legal status of search delisting as a moderation remedy: X’s primary defence is that search delisting is neither a breach of its Terms of Service nor a form of moderation subject to due process safeguards under Article 17 DSA. The court’s decision therefore examines contract and consumer law as well as the DSA.

 

Search delisting as a breach of contract?

Regarding the breach of contract, X argues that discoverability in X’s search functions is not an ‘essential functionality’ of the service. Furthermore, the platform retains the right in its general terms and conditions to modify access to functionalities and other obligations under the agreement. But the court deems this provision to be non-binding by reference to the Unfair Terms Directive, which prohibits consumer terms that authorise traders to unilaterally amend or suspend their service without valid reasons. The court also observes that Mekić’s ~20,000 followers amount to only a fraction of X’s total userbase of ~64 million, suggesting that algorithmic discoverability amongst non-followers is indeed an important component of the service.

Article 14 DSA also seems relevant here, though the court doesn’t discuss it. This provision requires intermediary services to include clear information in their general terms and conditions, in the context of content moderation, about “any restrictions they impose on the use of their service concerning information provided by the recipients of the service”. And the DSA’s definition of content moderation explicitly includes “measures taken that affect the availability, visibility, and accessibility” of content. The DSA thus makes clear that visibility is an essential element of the service provided by platforms, and that restrictions in this area require clear contractual policies.

 

Search delisting as moderation action under Article 17?

X’s case is even weaker under Article 17 DSA. This provision explicitly states that ‘restrictions of the visibility’ of user-generated content require a clear and specific statement of reasons. It also lists several types of information that this statement must contain (e.g. the nature of the sanction, the legal grounds and factual circumstances relied on, and possibilities for redress). The court has little difficulty in concluding that search delisting falls within the scope of this provision and that the statements provided by X so far do not satisfy these requirements. In any case, Article 17 DSA requires statements to be made proactively, not after repeated requests from the user.

X also opposes these reason-giving duties on proportionality grounds. This is a concern I have also raised in my own writing: compliance with Article 17 DSA can be very costly given the enormous scale of content moderation. Additionally, providing justifications can, in theory, undermine the effectiveness of moderation; more information may help malicious users (e.g. cybercriminals, spammers) to better circumvent existing moderation systems. But apart from limited carveouts for misleading commercial content and for law enforcement, Article 17 DSA does not contain any general exceptions or proportionality tests. The judge therefore rejects this proportionality argument, since it has no clear legal basis in the DSA.

In my view, such proportionality concerns are, for now, mostly theoretical. There may come a day when Article 17 DSA becomes so onerous that judges feel the need to find accommodations (for instance, by applying a de minimis threshold to the concept of moderation actions, such that the least impactful measures are exempted from Article 17 DSA, or through an expansive interpretation of the spam exception under Article 17(2)). But those days of excessive transparency seem far off indeed; at present, there appears to be a serious lack of compliance with Article 17 DSA across major platforms. Mekić’s case is corroborated by countless other academic and journalistic accounts of users being left in the dark as to whether and why they have been moderated – especially marginalised groups such as Palestinian activists. As an enterprising legal expert, Mekić managed to get some answers. But the big question remains whether platforms will observe these same rights proactively and at scale for all users in the EU, as the DSA demands.

It also remains to be seen how Article 17 DSA will affect sanctions other than search delisting. As I have discussed in some detail elsewhere, some moderation actions are easier to detect than others. Search delisting is relatively easy to demonstrate, since targeted search queries can be entered to test whether the account in question is still discoverable. But other visibility restrictions are subtler by design, for instance when they demote content rather than delist it, or when they act through recommendation features rather than search features. Users seeking to challenge such undisclosed ‘shadow bans’ may face a problematic burden of proof in private litigation, and stand to benefit from regulatory oversight (which, in the DSA’s case, may include scrutiny from auditors and researchers).

 

Damages for wrongful moderation?

To my knowledge, this is also the first case to award monetary damages under the DSA, albeit only a symbolic amount. Mekić sought just $1.87, as pro rata compensation for his subscriber fees for X’s Premium features over the duration of the search delisting. Accordingly, the court did not address whether wrongful moderation and undue visibility restrictions might also justify damages claims as such, even from non-paying users. Nor does the court address whether damages might be available for breaches of Article 17 DSA as such – i.e. shadow banning – even if the underlying moderation decision is deemed correct.

A recent decision from Belgium suggests that these damages could be substantial. A Belgian MEP, Tom Vandendriessche, was recently awarded €27,279.03 in damages due to wrongful and undisclosed visibility restrictions imposed by Meta. The facts of this case predate the DSA, however, and the decision is instead based on the GDPR’s automated decision-making provisions. (Any takers for a DSAO blog contribution on this important case?) Moreover, the GDPR’s damages provision is not identical to the DSA’s, and it remains to be seen how this case law develops.

Conclusion

 

Amidst the steady stream of new initiatives and reports from the European Commission and the Digital Services Coordinators (DSCs), it’s easy to forget that the DSA also has significant implications for private litigation. Much like the famous GDPR litigant Max Schrems before them, trailblazers like Danny Mekić are using EU law to take on big tech and set key precedents for user rights. Who else will follow in these footsteps? The DSA affords ample opportunities for litigation not just for enterprising legal academics but also for CSOs, activists, and commercial influencers. To fuel litigation and strengthen DSA compliance, an important factor may be the opportunity to claim damages for wrongful moderation, above and beyond the symbolic amounts now awarded.

Litigants will also be looking to the DSA’s oversight bodies to ensure that their user rights victories are implemented at scale. Regulators, auditors and vetted researchers have an important role to play here, since only they possess the data access privileges necessary to establish compliance at a systemic level. On shadow banning specifically, one important opportunity will be the Commission’s ongoing proceedings against Facebook and Instagram, which include an investigation into their compliance with Article 17(1) DSA and with transparency and user redress obligations regarding the demotion of political content.

Finally, this case also offers a cautionary tale for the EU’s proposed CSAM Regulation, which contains aggressive new duties to detect CSA content. Digital rights critics have roundly criticised this proposal for its disproportionate risks to freedom of expression (as well as to other rights, such as privacy), warning that intermediaries are far less effective at detecting harmful content than legislators often seem to think. I can think of no better example than this case, where even legal-academic criticism of the proposal itself was wrongfully moderated as CSAM. It’s a credit to the DSA that its new user rights can now be leveraged to detect and correct some of these errors. But as long as the EU continues to heap excessive moderation duties on platforms, it will be exacerbating the underlying problem. The DSA might stem the bleeding, but prevention is better than cure.

 

This contribution is based on a Dutch-language case note, which is forthcoming in Mediaforum.