If at first you don’t succeed: Reflections on a rejected Art. 40 DSA data access request

By Catalina Goanta & Anda Iamnitchi

Article 40 of the Digital Services Act was hailed as a breakthrough for platform research. But what does the procedure look like in practice? Drawing on their own rejected data access request, the authors reflect candidly on early lessons from the first wave of Article 40 applications, and on what researchers should know before applying for access to platform data. Readers are also invited to contribute to an ongoing researcher survey and join a webinar on 20 March to unpack more lessons learned about DSA data access.

Context

If you are a scholar interested in how social media shapes society, the past years have been tough. That’s because when you need data to investigate such impact, and that data is privately held, research in this space can become practically impossible. After an initial Wild West of data scraping and platform access through bilateral agreements, punctuated by the Cambridge Analytica scandal, we’re now in a phase where platforms are generally doubling down on the contraction of their data availability. A few examples of this contraction are the high fees for API access (e.g. X’s heavily monetized API), the deprecation of entire resources (e.g. Meta’s CrowdTangle), or mysteriously incomplete and buggy APIs (e.g. TikTok’s Research API). All of this leaves computational researchers hungry for (good) data.

In the meantime, the Digital Services Act (DSA) has been adopted in the European Union (EU). The DSA’s Article 40(4), which deals with data access for vetted researchers to study systemic risks in Europe on large platforms, was seen as a holy grail of renewed data availability. In the past years, law, computer science and computational social science conferences abounded with enthusiasm around this new provision, in the hope that it would remedy the pitfalls of the current data contraction era and the resulting data hunger. From a regulatory perspective, data access for vetted researchers is an innovative transparency measure aimed at alleviating, one study at a time, the power imbalance between large platforms and other stakeholders such as regulators, researchers, as well as civil society as a whole.

The decisions on the first wave of data access requests are starting to become available. Our research team received one of these decisions, which, unfortunately, was a rejection. This blogpost shares some of the many lessons we learned in the process of submitting a data request and understanding the resulting decision. What follows is a mélange of factual details about our application, but also some deep (and vulnerable) reflections on where we went wrong. We hope this combination of facts and honest introspection can be a drop in the ocean toward understanding how Article 40 will/should work in practice. We also hope this can help other researchers not waste time or have unrealistic expectations about this new data access regime.

Our data access request and subsequent rejection

Our research focuses on how content monetization drives commercial and political content on social media platforms. Like many other social media scholars, we started looking into the Romanian presidential elections of 2024, where TikTok monetization (e.g. through hidden influencer marketing and live streaming political content) seemingly helped unfairly amplify political candidates. We interpreted this phenomenon to potentially qualify as a systemic risk according to Article 34(1)(c) DSA (“any actual or foreseeable negative effects on civic discourse and electoral processes, and public security”). Back in December 2024, the European Commission itself had started an investigation against TikTok on election risk, which further supported our framing.

Our team consisted of two computer scientists and one legal scholar, from two Dutch universities. The data access proposal sought access to a range of TikTok content that was public but generally inaccessible due to API issues (e.g. hashtag-based posts, as well as content from specific accounts), but also data that was not public (e.g. reshares tracked outside of the platform, reported content and internal guidelines for community standards enforcement). Our research goals were:

  • identifying and quantifying systemic risks arising from monetization of political content through influencer marketing and live streaming;
  • examining transparency practices in the monetization of political content; and
  • analyzing content moderation and self-moderation practices by users and the platform to understand how community guidelines are applied and perceived.

Our data security and management plans followed the research practices we are familiar with as multidisciplinary researchers engaging with computational methods, aligned with the standards we have applied across different successful research grant applications.

Two of the members of the team had previously participated in a data access pilot hosted by the European Research Council, which thoroughly discussed implementation procedures for Article 40 in a multi-stakeholder approach.

Our data access request was submitted on the very day the DSA Data Access Portal went live (28 October 2025), and we chose to submit it to the Autoriteit Consument & Markt (ACM), the Dutch Digital Services Coordinator (DSC). We received our notification from Coimisiún na Meán (CNAM, the Irish DSC – TikTok is established in Ireland) on 19 February 2026 (around 80 working days later, as specified in Article 7 of the Delegated Act on Article 40 DSA).
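As a rough sanity check on that timeline, the gap between the two dates mentioned above can be counted in working days with a few lines of Python. Note that the holiday calendar the Irish DSC actually applies is our assumption here; the three holidays below are purely illustrative.

```python
from datetime import date, timedelta

def working_days(start: date, end: date, holidays=()) -> int:
    """Count weekdays in the half-open interval [start, end),
    skipping any dates listed in `holidays`."""
    count = 0
    d = start
    while d < end:
        if d.weekday() < 5 and d not in holidays:  # Mon-Fri only
            count += 1
        d += timedelta(days=1)
    return count

submitted = date(2025, 10, 28)  # portal opening day (from the post)
decided = date(2026, 2, 19)     # CNAM notification date (from the post)

# Hypothetical holiday list; the calendar a DSC applies is an assumption.
holidays = {date(2025, 12, 25), date(2025, 12, 26), date(2026, 1, 1)}

print(working_days(submitted, decided))            # 82 weekdays
print(working_days(submitted, decided, holidays))  # 79 with these holidays
```

Either way you count, the result lands in the neighbourhood of the 80-working-day period set out in Article 7 of the Delegated Act.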

Our application failed on most counts (5/7), including:

  • we did not prove vetted researcher status;
  • we did not elaborate enough on being independent from commercial interest;
  • we did not sufficiently prove that we meet security and confidentiality requirements;
  • we did not justify the data and time frames enough;
  • we did not sufficiently explain what we will make public in terms of research results vis-à-vis data protection concerns.

Some thoughts on our rejection

After being involved in the data access debate around Article 40, both in publications and conference discussions (see our paper forthcoming in ICWSM proceedings) but also in policy-making (ERC pilot), we were confident that we generally understood not only the spirit of Article 40 but also how to make it work in practice. We were, obviously, wrong. And here’s what we were wrong about:

  1. Article 40 is not merely a research mechanism, but a legal administrative procedure. Our research team was already on the more conservative side of the debate around what Article 40 is actually for. Given the diverse practices around strict and serious data management across different disciplines, including how little knowledge and practice exists around this in some fields, we were reluctant to champion an interpretation of Article 40 that would turn it into just another data source guaranteeing novelty in a top-tier publication for any researcher willing to experiment with it. After all, data that can be accessed through Article 40 is supposed to reveal some of the most pressing vulnerabilities of European society as a whole (e.g. systemic risks around fundamental rights, elections, illegal content). This, in itself, should be seen as the most sensitive research that can ever be done on social media data, which also raises questions of who should have access to it and under which circumstances.

    What became evident in the process, and especially in the rejection, is that any applicant for data access must look at their own application with legal eyes: they are submitting to an administrative authority a legal request that can be challenged in a court of law, and they therefore need to be more diligent than funding application standards have taught us to be. In academia, we are at best used to the language, practices and documents of funding processes, whose legal nature remains generally invisible or unaddressed. In the funding world, the application is a conversation mostly between academics, overseen by a public or private organization. Article 40 data access is not that. It is a legal procedure, and it should be understood and prepared as such. We could not even prove our vetted researcher status (which in our case carried a low burden of proof, since we work full time at two universities), because we submitted documents that were not in one of the official languages of the authority that actually needs to make the legal decision, which was an unfortunate oversight.

    In making our application to the Dutch DSC, we thought we would be covered by Dutch administrative law. But we learned that regardless of which body receives the application (either the national DSC or the Irish DSC, given that TikTok’s EU base is there), everything converges into a single legal procedure that will (in our case) take place exclusively under Irish law. The Dutch DSC’s assessment was a non-binding opinion, not an actual administrative act. This sounds very intricate, but upon reflection it makes perfect sense: the DSA achieved a lot, yet it is not a harmonization instrument for administrative law, which remains mainly national.

  2. Article 40 is not an individual researcher application with university backup, but an institutional application with a researcher’s face. Echoing the point above, what we are used to in academia is driving funding applications in a more or less independent manner, with (some) institutional advice and support along the way, depending on the organization. An Article 40 application is not that. As the burden of proof is considerably higher than with funding applications, we have learned that it might be more constructive to see it as an institutional application that is driven by a specific researcher. That means that from the first moment when you consider making a data access application, you rally the troops: your university/faculty’s data protection officer, information security officers, data management experts, and very importantly, the legal officers. That is a line-up to which very few researchers will have readily available communication channels.

    The situation can become even more complicated when a data access request comes from researchers spread out across multiple institutions. Alignment and discussions should lead to a uniform stance on applicable standards (e.g. security), but also to actual contracts between parties (e.g. different universities), as well as a joint understanding of how liability would be split if, for instance, there were a data security incident. All these matters increase the overhead of making such applications, and more standardization and debate are necessary at the infrastructural, organizational level as well. All the aforementioned university experts will have to be trained in the complex workings of Article 40, and may sometimes have to gather additional knowledge to know what to expect (e.g. Irish administrative law, but also platform Terms of Service, platform NDAs, and any other information that might become relevant if researchers actually reach the stage where a data access contract is negotiated with a platform).

  3. Article 40 is complex to interpret and apply for, so only use it if you are prepared. One of our mistakes was to ask for data too soon. As with any new regulation that has a complex implementation, the reality on the ground is that it will take a long time until the process is clear, complete and smooth from the perspective of aligning expectations between researchers, DSCs and platforms. We submitted our request before much information or guidance (e.g. from the ACM or CNAM) had come out, and could therefore not rely on it.

    At the same time, we took the application partly as a way of testing the waters, understanding that a certain amount of testing the interpretation of Article 40 in order to improve it is a necessary part of this process. But researchers should think carefully before submitting requests if they are not fully prepared. National administrative authorities, despite their willingness to make Article 40 work, face considerable resource limitations. Especially for platforms established in Ireland (which is true of most VLOPs), this means that most of the administrative burden falls on a single Member State authority. If we submit incomplete applications that are not meticulously prepared, we are not only wasting our own time but also taking up space in a long administrative queue: a lose-lose situation for researchers, the wider research community, and the DSCs.

  4. Lastly, Article 40 is not a solution for the academic data hunger problem. Data access contraction is here to stay, and Article 40 will not remedy that, because it is not just another way of getting data from platforms, like scraping or using a research API. It is a path to data access that is like no other, because the burden of proof is also like no other. This might affect one of the essential incentives to get that data in the first place: publishing your results and weighing the tensions between transparency (e.g. for reproducibility) and privacy (e.g. for the protection of user rights). After all, why do academic researchers want access to data? Mainly for scientific publications (although a wider discussion can also challenge this purpose and address whether data access under the DSA is suitable for forensic investigations that might resemble contract research more than academic research).

    It may be that for some kinds of data relevant to systemic risks (e.g. extremely sensitive data), the academic peer-reviewed publication objective is not a suitable route for making results public. This can cause serious problems for validating findings and ensuring reproducibility through peer review (contractually, the data cannot be shared with anyone other than the research team), as well as for fitting into current academic incentives, where peer-reviewed publications are what make all the effort worthwhile. Our academic incentives, but also the review of highly sensitive research, might need reframing before we can integrate this kind of work into our daily life as researchers. Such reframing might entail new forms of peer review and publication for highly sensitive research.

There is so much more that can be discussed about these procedures. After going through the official rejection letter carefully, we started wondering: are there actually any applications that have been successful thus far? Since the 80-day period is upon us for the first wave of applications, we will soon be able to tell if any applications made it through (also because successful applications should be displayed on the Data Access Portal). If you’re sitting on a successful application somewhere while reading this, kudos to you, send us your university address and we’ll send you flowers, as they are well-deserved.

Article 40 remains a great tool and promise, and these reflections pertain to the early days of its implementation. One of the most important lessons we’ve learned is that if we are to contribute to its success, we ought to take it for what it is – a legal procedure with an incredibly high burden of proof, which reflects the gravity of what you’re asking for: to do research on Europe’s deepest vulnerabilities in the digital space. As researchers, we might contest that burden when it poses obstacles to our work. Yet as citizens reading the news every day, we’d be disappointed with European governance if access to such sensitive platform data did not come with strict safeguards.

If you want to stay involved in these discussions, here are two ways you can do that: we have a survey running on systemic risks, and we’ll host a webinar on 20 March to unpack more of the lessons learned. Let’s join forces to make Article 40 work.

More resources

ACM (Dutch DSC), Access to platform data for vetted researchers under the DSA, https://www.acm.nl/en/publications/access-platform-data-vetted-researchers-under-dsa.

European Center for Algorithmic Transparency, FAQs: DSA data access for researchers, https://algorithmic-transparency.ec.europa.eu/news/faqs-dsa-data-access-researchers-2025-07-03_en.

Coimisiún na Meán, Researcher Survey on Article 40 Data Access, https://www.cnam.ie/app/uploads/2025/09/Researcher-survey-report-FINAL.pdf.

Coimisiún na Meán, Article 40(4-11) DSA: Guidance for Applicants, https://www.cnam.ie/app/uploads/2026/02/DSA-Article-40-4-11-Data-Access_Guidance_1.pdf.

Coimisiún na Meán, Vetted researchers newsletter, https://www.cnam.ie/industry-and-professionals/online-safety-framework/certifications-schemes/vetted-researchers/vetted-researcher-newsletter/.