Why Digital Services Coordinators should establish strong research and data units
In short
To detect and mitigate infringements of the Digital Services Act (DSA), member states’ Digital Services Coordinators (DSCs) need a deep understanding of how platforms work and what potential risks are associated with them. Given that the DSA also creates a multitude of reports and databases to monitor and analyze, member states should equip their DSCs with research and data units that can develop knowledge on platform risks, conduct data analyses, participate in expert communities and support EU-level enforcement efforts.
Introduction
Under the EU’s Digital Services Act (DSA), regulators and researchers will be able to request data from certain online platforms. Very large online platforms and search engines (VLOPs), meaning services with more than 45 million monthly users in the EU, have to provide data for external experts to analyze, so that they can better understand and mitigate potential systemic risks associated with platforms’ services. While these data access rules have received a lot of attention as a potentially innovative way to both oversee and study VLOPs, they are far from the only way that data analysis will play a big role in enforcing the DSA. The law is a “data-gathering machine”, containing more than 50 references to mandatory or voluntary transparency and evaluation reports, databases, activity reports, guidelines, codes of conduct and standards. Regulators will have to expand their data science expertise to deal with large amounts of data of various types. More importantly, to properly use any data to check companies’ compliance with the DSA, oversight bodies need a profound understanding of platform logics and platform risks. This applies not only to the European Commission, which oversees key parts of the DSA for VLOPs, but also to the member states. Their respective Digital Services Coordinators (DSCs) should therefore include a research and data unit to develop platform-specific oversight expertise and capabilities.
Setting up a research and data unit could help DSCs face a diverse set of tasks that requires them to analyze and evaluate data from platforms such as social media sites, online marketplaces and search engines, as well as from trusted flaggers and other organizations. In their respective countries, DSCs will coordinate regulators from various fields such as data protection and consumer protection, allow people to file complaints about possible infringements, accredit trusted flaggers and serve as the regulator for many online platforms, for instance, by receiving transparency reports and other data (for an analysis of the DSCs’ tasks, see chapter 2 here). They cooperate at the EU level, for instance, in a joint advisory body (the European Board for Digital Services) and via joint investigations. DSCs also support the Commission, which is mainly in charge of overseeing compliance with many key rules for VLOPs (for an overview of the division of labor between the Commission and member states, see figure 2 here). Furthermore, those DSCs with VLOPs established in their countries can request data from these platforms to check compliance with the DSA. All of this would be easier to handle with a capable research and data unit at the DSC.
To be sure, the DSC is neither a governmental research institute like Germany’s Federal Institute for Population Research nor a governmental research funder like the French National Research Agency nor a governmental research advisory body like the Research, Development and Innovation Council in the Czech Republic. However, as will be argued here, it would be helpful for any DSC to incorporate elements of these different types of research-related activities into its work: it needs to conduct, fund and coordinate platform-related research. In practice, this would entail a substantial research budget as well as a specialist research and data unit. Having such a unit would allow the DSC to:
- establish deep knowledge of platforms and especially related systemic risks,
- support and push the Commission’s enforcement efforts,
- build, participate in and partly fund an expert community on platform oversight, and
- make the best use of the special data access provisions the DSA offers them (where applicable).
The following sections consider each of these four arguments for a research and data unit. The annex at the end of the text further specifies them, referencing the relevant DSA articles.
1. The baseline: Developing a deep understanding of systemic platform risks
A research and data unit at the DSC would be an important asset to build specific knowledge on platform regulation. While DSCs should draw from existing expertise at other regulators (for instance, data protection authorities), the DSA contains provisions that are unique to this new regime and are not covered by any existing regulator in most EU countries. Examples include platforms’ reports on content moderation (Articles 15 and 24), their design choices (Article 25) or their explanations for recommender systems (Article 27) as well as the accreditation of trusted flaggers (Article 22).
Over the short term, a research and data unit would be a valuable starting point to get DSCs up and running on platform oversight. This would acknowledge that although they do not start completely from scratch, DSCs are new regulatory entities in a new oversight field and will face a steep learning curve. In contrast, if the DSA had only tweaked existing rules in existing regulatory fields, a dedicated research and data unit might not be as vital. Initially, such a unit could study specific oversight issues and determine what type of technical resources, expertise and networks are necessary to handle them.
Over the long term, the knowledge built by the research and data unit would help DSCs check compliance with the DSA. If they want to detect and mitigate infringements of the DSA, DSCs will have to possess a deep understanding of the underlying risks regarding content moderation practices, algorithmic recommender systems or platform design. With its strong focus on “systemic risks”, the DSA requires oversight bodies to grasp such risks emanating from various platforms that might have different effects on certain groups – especially because the concept of “systemic risk” in the DSA needs to be further specified.
Assessing and mitigating systemic risks is only asked of VLOPs, and the Commission is tasked with checking compliance with these and other rules that apply to them. Yet, there are numerous requirements and options for DSCs to become involved in systemic risk assessments, too. Formally speaking, DSCs are asked to develop and share expertise on “systemic and emerging issues” (Article 64). This is not limited to issues surrounding VLOPs. Rather, developing platform-specific knowledge is the baseline for all regulatory action. When it does come to VLOPs, DSCs can request documentation from very large online platforms in their country (Article 34(3)), they must report on risk mitigation measures (Article 35(2)) and can support the development of guidelines on this (Article 35(3)). This is where dedicated research and data units could provide big benefits to the DSCs’ oversight work.
Imagine that a company reports on specific risks of its platform regarding regional or linguistic aspects (Article 34(2)), say, discriminatory speech against particular minorities in a regional dialect. While the Commission might have some expertise on this, it is the respective member state that could offer the European Board for Digital Services in-depth knowledge for its systemic risk report (Article 35(2)). It could present empirical evidence collected and analyzed by the research and data unit and put into the member state’s legal, historical and linguistic context.
Beyond systemic risk mitigation, strong data analysis skills, paired with expertise from diverse disciplines such as psychology, human rights and computer science, can also facilitate the analysis of the various reports and other big data sources the DSA mandates.
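To give a flavor of what such routine analyses might look like in practice, consider a minimal sketch in Python. It assumes a hypothetical, simplified extract of content moderation figures of the kind platforms report under Articles 15 and 24; all field names and numbers are illustrative assumptions, not an official DSA reporting schema.

```python
# Minimal sketch: comparing hypothetical content moderation figures
# across platforms. Field names and numbers are illustrative only,
# not an official DSA reporting format.
import pandas as pd

# Illustrative records: one row per (platform, violation category)
reports = pd.DataFrame([
    {"platform": "A", "category": "hate_speech", "notices": 12000, "removals": 9500},
    {"platform": "A", "category": "scam",        "notices":  4000, "removals": 3900},
    {"platform": "B", "category": "hate_speech", "notices":  8000, "removals": 2100},
    {"platform": "B", "category": "scam",        "notices":  1500, "removals": 1400},
])

# Removal rate per platform and category: large gaps between platforms
# for the same category can flag issues worth a closer regulatory look.
reports["removal_rate"] = reports["removals"] / reports["notices"]
summary = reports.pivot_table(index="category", columns="platform",
                              values="removal_rate")
print(summary.round(2))
```

Even such basic cross-platform comparisons require staff who can judge whether reported figures are plausible and comparable in the first place – precisely the kind of expertise a research and data unit would build up.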
2. The EU level: Supporting and pushing the Commission
The Commission is mainly responsible for supervising very large online platforms (for an overview of the division of labor, see chapter 2 here). But there are also various instances where DSCs, either individually or via the European Board for Digital Services, can assist the Commission in overseeing very large online platforms. In addition, even if not explicitly provided for in the DSA, Coordinators can offer expertise at the EU level and, by doing so, nudge the Commission towards action. The Commission, although capable and growing, will not immediately be able to cover all aspects of the DSA in depth. The better equipped the DSCs are to step up right away, the smoother enforcement will go.
Take the case of online ad repositories for very large online platforms (Article 39). Mandatory databases collecting all online advertising have never existed before. It will take some time for the Commission to understand the related issues and identify potential breaches, especially considering the many other obligations it has to oversee. To check whether such a database works and fulfills the DSA’s requirements, expertise from design and engineering (regarding functionality and usability), data protection (regarding the use of personal data) and digital marketing and ad tech (regarding an understanding of online advertising in general) might be necessary. With research and data units that can gather and develop knowledge on this interdisciplinary regulatory question, DSCs would be in a good position to contribute to the development of guidelines by the Commission via the Board (Article 39(3)). They could add national and linguistic expertise to the Commission’s EU-wide view on ad repositories.
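As a concrete illustration of what a first-pass technical check of such a repository might involve, here is a minimal sketch assuming a hypothetical export of ad records. The field names are invented for illustration; Article 39(2) names the categories of information to be included (ad content, advertiser, payer, presentation period, targeting parameters, aggregate reach) but does not prescribe a data format.

```python
# Minimal sketch: checking whether records in a hypothetical ad
# repository export carry the categories of information named in
# Article 39(2) DSA. All field names are illustrative assumptions.
REQUIRED_FIELDS = {
    "ad_content", "advertiser", "payer",
    "period_start", "period_end", "targeting_parameters", "total_reach",
}

def missing_fields(ad_record: dict) -> set:
    """Return the required fields that are absent or empty in one record."""
    return {f for f in REQUIRED_FIELDS
            if f not in ad_record or ad_record[f] in (None, "", [])}

# Two invented records, one complete and one incomplete
sample_ads = [
    {"ad_content": "...", "advertiser": "ACME", "payer": "ACME",
     "period_start": "2023-01-01", "period_end": "2023-01-31",
     "targeting_parameters": ["age:18-24"], "total_reach": 120_000},
    {"ad_content": "...", "advertiser": "ACME"},
]

for i, ad in enumerate(sample_ads):
    gaps = missing_fields(ad)
    status = "complete" if not gaps else "missing: " + ", ".join(sorted(gaps))
    print(f"record {i}: {status}")
```

A real compliance check would of course go much further (functionality, searchability, data protection), but even this kind of structural validation presupposes in-house technical staff.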
Besides the Commission, enforcing the DSA at the EU level will also involve the European Centre for Algorithmic Transparency, a research institution established by the Commission and its Joint Research Centre in 2022. It is meant to provide scientific expertise to the Commission regarding the oversight of VLOPs. If DSCs and the Board want to understand its research or work with the Centre on an equal footing, they can only do so with profound knowledge of systemic platform risks. Research and data units could provide this, thus again bringing additional national-level expertise to the Centre’s EU-level research efforts.
3. The expert community: Building and funding a network of platform researchers
Researchers from academia, civil society organizations and news outlets have played a big role in platform oversight in the past. Before the adoption of the DSA, researchers studied online advertising, drew attention to discriminatory algorithms and potential breaches of basic human rights, supported users seeking redress against platforms and explored ways to help people deal with disinformation online, to name just a few examples. The DSA explicitly foresees tasks for researchers and civil society, and implicitly acknowledges their overall importance in EU platform oversight. The DSA thus relies on a mix of private and regulatory enforcement by a multitude of actors, including researchers, trusted flaggers and out-of-court dispute settlement bodies. Since DSCs are in charge of accrediting and monitoring these actors, they will have to oversee a whole system of platform oversight organizations. This requires a strong understanding of platform logics and a strong network of platform experts hailing from diverse practical and academic backgrounds such as content moderation, human rights risk assessment, software engineering, behavioral psychology and law.
DSCs could harness external experts’ knowledge and expand existing networks to strengthen their own oversight work. A research and data unit could serve as a hub for platform-related studies, provide the framework for an expert community and, with some safeguards, fund external research. Guardrails are necessary to avoid creating too many dependencies and to prevent the DSC from becoming the sole arbiter of what platform research does and does not get conducted, especially since DSCs also vet researchers. A clear division between vetting and funding as well as strong transparency guidelines could help. Still, some funding activity by the DSC is especially pertinent for long-term, comparative research, for which civil society organizations often lack sufficient budgets. Going one step further, in connection with an outreach and science communication team, the DSCs’ research could also help inform the public, for instance, with explanations of recommender systems or awareness campaigns about certain platform risks.
4. The data access provisions: Requesting and analyzing platform data
The DSA’s data access provision (Article 40) is another big incentive to develop data analysis capabilities, especially for “DSCs of establishment”: These DSCs can request data from VLOPs that are established in their respective countries. This only applies to a few DSCs for now (see table 2), and the Commission, for its part, can request data from any VLOP. Because the Commission is primarily responsible for enforcing the rules for VLOPs, DSCs might take a backseat on data access requests.
Table 2. DSCs that can request data directly from VLOPs
| DSC of establishment | Company (VLOP(s)) |
| --- | --- |
| Ireland | Apple (App Store), Google (Google Maps, Google Play, Google Search, Google Shopping, YouTube), Meta (Facebook, Instagram), Microsoft (Bing, LinkedIn), Pinterest (Pinterest), TikTok (TikTok), Twitter (Twitter) |
| Luxembourg | Amazon (Amazon Marketplace) |
| Netherlands | Booking (Booking.com) |
Notes: Adapted and updated from Martin Husovec, “The DSA’s Scope Briefly Explained”. Alibaba and Snap run VLOPs (AliExpress and Snapchat, respectively), but since their European headquarters are in the United Kingdom, it is unclear which DSC will be in charge. The DSC for Wikimedia’s VLOP Wikipedia is also unclear.
Nonetheless, DSCs of establishment in particular would benefit from having a research and data unit. If these DSCs had both data scientists and experts from other fields such as human rights or psychology at hand, they could request, analyze and understand data from platforms. Otherwise, they risk either not using the Article 40 provisions at all or being at a disadvantage vis-à-vis platforms, not knowing what data they can request or how to adequately judge the quality of the data they receive. Not using the data access provisions granted to DSCs of establishment and not being able to help researchers request data would play into the hands of tech companies, which could counter arguments for more openness and transparency by pointing to the lack of data access requests.
Even DSCs with no very large online platforms in their countries can and should use the Article 40 provisions, namely when evaluating and forwarding data access requests from researchers: without a good understanding of platforms and a dedicated team of internal and external experts to cross-check researchers’ proposals, DSCs will be at a loss to comprehend and vet data access requests. Consider, for example, researchers from a Spanish university seeking data from Instagram. Even though Instagram is established in Ireland, the Spanish team could still submit their request to the Spanish DSC. The Spanish DSC would forward it to Ireland, but only after conducting an initial assessment – which can carry real weight if it is a well-argued analysis of the request that requires no further action by the Irish DSC. Producing such a comprehensive and thoughtful initial assessment is much easier if the Spanish DSC has the knowledge and skills on data analysis and platform risks that a research and data unit can deliver.
Outlook: Building research and data units without creating redundancies
If member states decide to equip their DSCs with considerable research and data units, they can learn from existing bodies and proposals. National regulators as well as the Commission have scaled up their data analysis work. For instance, competition authorities have built data analysis units, a Dutch agency has discussed its efforts to hire more data scientists, the French government has set up a dedicated digital platform research agency called PEReN, Germany has established fellowships to bring design and engineering expertise into the administration, and US lawmakers have proposed bills that would create dedicated research units including “technologists”, “sociotechnical experts” and not-for-profit experts in platform oversight. In addition, there are many governmental research institutes outside of the tech sector that can serve as inspiration.
Surveying such examples from their domestic regulatory landscape should be member states’ first step towards building research and data units. An exchange between the Commission and (prospective) DSCs in the emerging EU-level discussion could help identify good and bad practices. Some of the questions that member states should seek to address are (for additional considerations on the structure of the DSC, see here):
- Budget: Does the unit only need some in-house experts or also the ability to fund outside studies and conduct science communication and community-building?
- Size and diversity of the team: Is the emphasis on a small group of data science professionals or should a larger cast of experts from software engineering, various academic fields and content moderation be assembled?
- Organizational structure: Is the unit a free-standing specialist team that supports other DSC departments, or is it embedded within another department?
Different countries will have different answers to these questions, especially because not all member states will be able or willing to spend a lot of money or find fitting staff. That is why it is all the more important that at least some member states establish strong, specialized units that can support other DSCs. Such a strong unit would have a sufficient budget for internal and external experts who bring a diverse array of academic and practical experience from law, data science, computer science and machine learning, and it would be agile enough to interact with the various other units at the DSC.
Establishing research and data units at the DSCs in this way does not have to create redundancies and overlaps with existing agencies. Quite the contrary: with dedicated units that have strong connections to the Commission, the European Centre for Algorithmic Transparency and other domestic regulators, DSCs will be better placed to avoid duplicate work and spot research gaps.
Whether to equip the DSC with a dedicated research and data unit will ultimately be a political and financial decision as much as a technical one. Beyond the practical reasons given here, building an expert DSC with a strong research focus would also allow member states to underscore their ambition to take enforcing the DSA seriously.
Annex. Different use cases for strong research and data units within DSCs
The following table summarizes the main tasks attributed to the DSCs and maps how robust research and data units within DSCs could help with carrying out these tasks.
| What is asked of the DSCs (DSA article) | How a strong data and research unit at the DSC can help |
| --- | --- |
| 1. Developing a deep understanding of systemic platform risks | |
| Drawing up national guidelines on complaints mechanisms (Recital 39) | (Data science) analysis of complaints could be used in addition to practical experiences from other DSC units and/or other regulators |
| Receiving various reports (e.g., on content moderation, from trusted flaggers) (15(1), 22(3), 24(2), 42(4)) | Deep understanding of platform risks is a basic necessity for analyzing reports; data science capabilities can support other units and/or enable own analyses |
| Reporting on systemic risks (via Board) (35(2)) | Deep understanding of platform risks is a necessary prerequisite for any own reports |
| Drawing up standards (via Commission with consultation by Board) (44) | Providing own empirical evidence; groundwork for the Commission and Board is not possible without a deep understanding of the topics mentioned under Article 44(1)(a-j) |
| Receiving information from platforms on infringements, inspecting platforms and interviewing platform employees (51(1)(a-c)); receiving a platform’s action plan to stop infringements (51(3)(a)) | Without a good grasp of platform risks, potential infringements and action plans cannot be judged properly; risk of deception by tech companies |
| Offering users a complaints system (53) | Substantiated assessment of complaints is only possible with deep knowledge of platforms, platform risks and the regulatory landscape (otherwise delays/redundancies in cooperation with other regulators) |
| Conducting joint investigations with other DSCs (58) | Deep understanding of platform risks is a necessary prerequisite for joint investigations |
| Building up EU expertise (with Commission; especially on VLOPs and emerging issues) (64) | Identifying “emerging issues” (akin to an early-warning system) and systemic risks, especially in a specific regional/linguistic context |
| 2. Supporting the Commission | |
| Helping the Commission draw up guidelines (25(3), 35(3)) | Groundwork for the Commission; providing own empirical evidence; without deep knowledge of platform risks, risk mitigation measures cannot be judged properly |
| Understanding platforms’ risk assessments (via Commission) (34) | Adding to Commission expertise with additional empirical findings (especially within the national context) |
| Working with online ad databases (39(1), 39(3)) | Practical regulatory experience with mandatory online ad databases is missing across the EU; DSC units could help develop expertise |
| Drawing up guidelines for online ad databases (via Board) (39(3)) | Groundwork for the Board; providing own empirical evidence |
| Supporting the Commission in drawing up delegated acts on data access (40(13)) | Informal groundwork for the Commission, providing own empirical evidence; participation in official consultations |
| Supporting the Commission with the development of voluntary codes of conduct (45; also 46, 47) | Groundwork for the Commission; providing own empirical evidence |
| Offering recommendations to the Commission (via Board) (63(1)(c)) | Own research and expert network to support drawing up recommendations |
| 3. Building an expert community | |
| Vetting researchers (40(8)) | Vetting of formal (especially data protection) requirements; prioritization of requests based on own knowledge of platform risks and research gaps; exchange with other DSCs |
| 4. Requesting and analyzing data from very large online platforms and search engines | |
| Requesting VLOPs’ data (40(1) and 40(2)) | Identifying and formulating fitting research and regulatory questions; formulating actual data requests (technical/legal issues); (data science) analysis and contextualization within economics and social sciences; networking with external experts (potentially via a different DSC unit for science communication and outreach) |
| Understanding algorithmic systems and systemic risks of VLOPs (40(3) and 40(4)) | Developing permanent, diverse knowledge regarding different platforms and platform risks, which is the basic premise for own data access requests, for understanding the data and for cooperation with other DSCs and the Commission; partial fulfillment of Article 64 |
Note: Some of the items could be grouped under multiple headings, for instance, Article 40(3) concerns data access, expert community and EU-level work; Article 34 concerns both systemic risk expertise and supporting the Commission.
Julian Jaursch is a policy director at the not-for-profit think tank Stiftung Neue Verantwortung (SNV) in Berlin, Germany. He focuses on platform oversight and has published policy analyses and recommendations on the DSA. The author thanks the experts at SNV and beyond who generously shared their insights on the DSA and is grateful for helpful feedback to a draft of this text from Ilaria Buri, Martin Husovec, Rita Jonušaitė, Claire Pershan and Suzanne Vergnolle.