How Have Platforms Addressed Addictive Design Under DSA
By Cecilia Isola
SERICS (Security and Rights in Cyberspace); University of Genoa
This piece is part of a series with Tech Policy Press featuring articles adapted from selected papers presented at the second DSA and Platform Regulation Conference, an event marking two years since the Digital Services Act came into full effect.
This year, the European Commission has intensified its scrutiny of addictive design under the Digital Services Act (DSA). On Tuesday, the European Commission opened formal proceedings against Shein, focusing on risks linked to engagement-based design mechanisms, including systems that reward user activity through points or similar incentives, and on whether the company’s mitigation measures were adequate.
Earlier this month, in its preliminary findings regarding TikTok’s alleged addictive design practices, the Commission identified design features such as ‘infinite scroll’, ‘autoplay’, persistent push notifications, and ‘highly personalized recommender systems’ as elements capable of harming the physical and mental well-being of its users, including minors and vulnerable adults.
The Commission’s recent proceedings signal that addictive design has become a central focus of regulatory scrutiny in the EU.
What is addictive design?
What, then, is addictive design? Although the term is increasingly used in policy and regulatory discourse, it still lacks a precise definition. It is often referred to as a specific type of ‘dark pattern’ (Esposito, F. & Ferreira, T.C., 2024; Santos, C., Morozovaite, V., & De Conca, S., 2025), namely an interface design technique aimed at influencing user behavior in the provider’s interest (DSA, Recital 67). Yet, this description captures only part of the phenomenon.
Addictive design is better understood as a configuration of the service as a whole. In this sense, addictive design emerges where design features, user data, and algorithmic systems are combined within a broader logic of engagement maximisation, designed to keep users interacting with the service for as long as possible. This is particularly evident in some popular online platforms, especially social networks and, more recently, online marketplaces. Within this configuration, specific design features ensure that users’ interaction with the service remains continuous and frictionless (e.g., infinite scroll, autoplay), while also stimulating re-engagement (e.g., push notifications). At the same time, user data is collected and processed by algorithmic systems to identify which types of content are more likely to attract and retain a particular user’s attention and to organise content exposure accordingly, selecting and prioritising material with a higher probability of being perceived as rewarding.
Here, “reward” refers to the positive outcome that may follow a user’s action within the platform, for example, social recognition resulting from a like, or interesting content surfacing after continued scrolling. Crucially, users do not know when, or even whether, such a reward will occur. It is precisely this uncertainty, comparable to the variable reward mechanisms of slot machines, that increases the likelihood that interaction will continue (Esposito, F. & Ferreira, T.C., 2024).
A growing body of empirical research indicates that such dynamics may significantly affect mental well-being. From a psychological perspective, intermittent and unpredictable rewarding stimuli activate dopaminergic processes involved in anticipation and behavioural reinforcement. Over time, exposure to intermittent and unpredictable rewards increases the likelihood that users will repeatedly perform the same action in anticipation of a positive outcome. This repeated engagement, reinforced under conditions of uncertainty, may gradually reduce users’ capacity to disengage, contributing to excessive or compulsive use and, in some cases, to addiction.
Addictive design as a “systemic risk” under the DSA
The Digital Services Act constitutes the central pillar of the European Union’s new regulatory framework for digital services. Adopted in 2022 and fully applicable since February 2024, it lays down harmonized rules on liability, due diligence obligations and enforcement for online intermediary services provided in the EU.
The DSA adopts a layered approach under which obligations become more stringent according to the role, size and societal impact of the service. The most demanding obligations (Articles 33-43 DSA) apply to Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs), defined as online platforms and online search engines that reach an average of at least 45 million monthly active recipients within the Union.
Particularly relevant in the context of addictive design are Article 25 and Articles 34–35 DSA. Article 25 introduces a general prohibition applicable to providers of online platforms, preventing them from designing or organising their online user interfaces in a manner that deceives or manipulates users or otherwise materially distorts or impairs their ability to make free and informed decisions. By contrast, Articles 34 and 35 apply exclusively to VLOPs and VLOSEs and concern the assessment and mitigation of systemic risks that may arise from the design, functioning and use of their services. In particular, they require these providers to identify such risks and to adopt appropriate, effective and proportionate mitigation measures.
Although “systemic risk” is not defined, Article 34 DSA highlights four categories: dissemination of illegal content or illegal activities; impacts on fundamental rights; risks to democratic processes and public security; and risks to public health, minors, and users’ physical and mental well-being. While addictive design is not explicitly mentioned, the latter category strongly indicates that engagement-driven architectures capable of producing addictive effects fall within its scope. This interpretation is supported by Recital 83 DSA, which, although not reflected in Article 34 itself, provides interpretative guidance on the fourth category of risks and explicitly recognises that risks to users’ physical and mental well-being may arise “from online interface design that may stimulate behavioural addictions of recipients of the service”.
Moreover, concerns regarding addictive design have been explicitly articulated at the political level. In its Resolution of December 12, 2023, the European Parliament raised specific concerns about the growing prevalence of online platforms’ digital architectures with addictive potential. The Resolution identifies design features such as “endless scrolling”, “pull-to-refresh” page reload, “never ending auto-play” video features, personalized recommendations, and “recapture notifications” as capable of fostering excessive or compulsive use, thereby generating risks for mental health and well-being. Significantly, it observed that, despite the entry into force of the DSA and other regulatory instruments, addictive design remains insufficiently addressed in EU law and, among its recommendations, called on the European Commission to address addictive design within the framework of Articles 34-35 DSA.
Is addictive design sufficiently addressed in VLOPs’ risk assessment reports?
The risk assessment reports of Instagram and Facebook (operated by Meta) and TikTok (operated by ByteDance) offer a concrete lens through which to observe how VLOPs interpret and operationalize addictive design as a systemic risk in practice, and how corresponding mitigation measures are articulated pursuant to Article 35 DSA.
A comparative reading of the 2023–2025 reports reveals both convergence and divergence. All three platforms adopt broadly similar procedural frameworks for risk assessment, structured around risk identification, evaluation of severity and likelihood, tiering, mitigation, and engagement with experts, drawing on normative sources (e.g., the DSA, the GDPR) as well as other reference frameworks, such as international standards (e.g., ISO/IEC). However, significant differences emerge in the substantive categorization of risks, particularly with regard to potential harms to mental well-being and addiction.
Instagram and Facebook do not address addictive design as a distinct systemic risk in their reports. The two reports, which appear substantially identical on this point, refer to mental well-being only in the introductory section entitled “Systemic Risk Landscape” (Instagram & Facebook Risk Assessment Reports, 2025). There, it is stated that: “Physical and Mental Well-Being is not listed as its own Systemic Risk Area in our Systemic Risk Landscape as potential impacts to physical and mental health are applicable to all Problem Areas and risks. Our Severity Principles consider Harm Type, which encompasses impacts on users’ physical and psychological health, and we evaluate these risks as part of our Inherent Risk calculation within our Severity Rubric. Higher severity scores are assigned to risks deemed to have a potentially elevated impact on an individual’s physical and psychological well-being. Meta also continuously develops resources to help support our users’ well-being (…)”.
However, the reports do not subsequently develop a dedicated analysis of mental well-being risks specifically linked to engagement-driven design features, nor do they outline concrete mitigation measures addressing harms potentially arising from excessive or compulsive use (e.g., screen-time management mechanisms).
Conversely, there is evidence that TikTok has gradually incorporated risks related to users’ mental well-being into its systemic risk taxonomy. To illustrate this, in its 2023 systemic risk assessment report, TikTok identified ten systemic risks, none of which explicitly addressed risks associated with excessive use or addictive patterns of engagement. In 2024, however, the platform introduced a new category of systemic risk labelled “Risk related to age-appropriate content and online well-being”. To mitigate the risks associated with this new category, TikTok proposed several measures. These included default screen-time management tools set at 60 minutes per day for minors (while remaining optional for adults), time-delay interventions aimed at interrupting prolonged usage sessions, updates to TikTok’s Community Guidelines, informational feeds designed to promote awareness of online well-being, and youth-oriented initiatives such as the establishment of a Youth Council.
Notably, only a few months earlier the Commission had opened formal proceedings against TikTok, raising concerns that certain design features of the platform could contribute to addictive patterns of use, particularly among minors. Although the VLOP does not explicitly link the introduction of this new risk category to the Commission’s investigation, the temporal proximity between the two developments suggests that the increasing attention to online well-being risks is unlikely to be incidental and may instead reflect growing regulatory pressure on platforms to address the behavioural effects of their digital architectures.
Finally, in its 2025 report, TikTok further expanded its systemic risk taxonomy to twelve categories (one more than in the 2024 report, the addition being a risk related to illegal and unsafe products). In relation to potentially addictive design, the risk previously framed in terms of online well-being was reformulated under the heading “Youth Safety and Online Engagement: Risk related to online engagement”. However, apart from the introduction of additional specialised teams tasked with monitoring safety and well-being and contributing to policy development and risk prevention, the mitigation measures addressing engagement-driven harms remained largely unchanged.
Addictive design in the Commission’s enforcement so far
Formal proceedings initiated by the European Commission are currently ongoing concerning the assessment and mitigation of risks associated with addictive design on TikTok, Facebook and Instagram (for which formal investigations were opened in 2024), as well as Shein (for which an investigation was opened in February 2026). To date, preliminary findings have been published only in relation to TikTok.
The Commission found that TikTok did not adequately assess how its addictive features could harm the physical and mental well-being of its users, including minors and vulnerable adults. According to the Commission, certain design features of TikTok fuel the urge to keep scrolling by constantly “rewarding” users with new content and may place users in an “autopilot” mode. Additionally, in its assessment, TikTok allegedly disregarded important indicators of compulsive use of the app, such as the amount of time minors spend on TikTok at night, the frequency with which users open the app, and other relevant behavioural signals.
On that basis, the Commission preliminarily concluded that TikTok failed to comply with Articles 34–35 DSA, since the company “seems to fail to implement reasonable, proportionate and effective measures to mitigate risks stemming from its addictive design.”
The tension between the DSA and VLOPs’ business model
At its core, this analysis reveals a significant tension between what the DSA requires and how VLOPs approach addictive design. That tension can only be understood by looking at the business model of online platforms.
Most online platforms, and social media in particular, operate under advertising-based business models. Users do not pay directly; revenue derives from the sale of advertising space. In this model, attention is the key resource (Heitmayer, M., 2025). The longer users remain engaged, the more advertising can be delivered and the more behavioral data can be collected, increasing the value of targeted advertising.
The very features identified as generating systemic risks are often the same features that sustain the platform’s business model. To illustrate, design techniques associated with excessive or compulsive patterns of use, such as infinite scrolling, personalised content feeds driven by recommender systems, and intermittent reward mechanisms, form an integral part of engagement-maximisation strategies that underpin advertising-based platform economies.
As a result, measures aimed at mitigating these risks, as mandated by Article 35(1) DSA, may directly affect the core logic of the service. In other words, any measure capable of limiting patterns of use may simultaneously affect the economic model of platforms whose revenues depend on maximising user attention.
In this respect, Article 16 of the Charter of Fundamental Rights of the European Union protects the freedom to conduct a business, which includes the freedom of undertakings to organise their economic activity and pursue commercial strategies within the limits of EU law. Any restriction must comply with Article 52(1) of the Charter: it must be provided for by law, respect the essence of the right, and satisfy the principle of proportionality. Economic freedom, however, is not absolute and must be balanced against other protected interests, including public health, consumer protection, data protection and fundamental rights more broadly. Accordingly, no hierarchical primacy between competing rights can be presumed in the abstract; rather, conflicts must be resolved in accordance with the principle of proportionality.
It follows that the mitigation of risks associated with addictive design brings to the fore the need to balance competing interests between, on the one hand, the regulatory objective of protecting users and the public interest and, on the other hand, the economic freedoms of VLOPs and VLOSEs under EU law.
The Commission’s 2025 decision against X illustrates this balancing exercise. The proceedings concerned the design of X’s verification system (‘blue checks’), which the Commission considered capable of misleading users and therefore in breach of Article 25 DSA, which prohibits deceptive or manipulative interface design. X argued that the Commission’s finding of a violation of Article 25 DSA constituted an unjustified interference with its freedom to conduct a business. Relying on CJEU case law, the Commission rejected this claim, emphasizing that entrepreneurial freedom must be understood in light of its social function and may be limited where the restriction is lawful, necessary, and proportionate. In this context, the Commission considered that requiring changes to the verification system did not undermine the essence of X’s economic activity but constituted a proportionate regulatory intervention aimed at safeguarding users’ rights. Article 25 of the DSA was thus framed as a proportionate constraint designed to protect users from deceptive interface practices without unduly interfering with economic freedom.
Key takeaways
Overall, the analysis suggests that addictive design, although not explicitly defined in the DSA, qualifies as a systemic risk within its regulatory framework and must therefore be addressed under Articles 34–35. Yet, it remains substantively under-addressed in VLOPs’ systemic risk assessments.
It must be acknowledged that platforms increasingly refer to mental well-being and concerns about prolonged use; however, their compliance often appears largely symbolic (Schramm, M., 2025; Griffin, R., 2023) rather than genuinely responsive to the risks generated by potentially addictive design features, as reflected in the Commission’s preliminary findings in the TikTok case.
The Commission’s findings are significant in this regard for two reasons. First, they explicitly confirm that addictive design should be considered a systemic risk under Article 34 DSA and therefore falls within the scope of the risk assessment obligation imposed on VLOPs and VLOSEs. Second, they clarify the requirements of Article 35 by giving concrete substance to the standard of mitigation measures applicable to addictive design.
In this respect, TikTok’s methodology was found to be inadequate, despite its efforts over the years to refine its assessment framework, including by following risk assessment methodologies established by international standards such as ISO/IEC. The Commission’s preliminary findings indicate that generic or user-side tools, such as default screen-time limits for minors, time-management prompts, informational resources, and the establishment of specialised internal teams tasked with safety and mental well-being oversight, are insufficient where the risk is embedded in the very configuration of the service.
The Commission’s enforcement action is therefore beginning to narrow the gap between formal compliance and substantive mitigation of engagement-driven harm. However, this enforcement trajectory remains incomplete, and it is still uncertain whether it will ultimately lead to concrete changes in platform design.
