Reclaiming the Algorithm: A call for social media interoperability
By Katarzyna Szymielewicz, Panoptykon
This piece is the second in a two-part series on reforming recommender systems for democratic resilience and the public good. It examines social media interoperability as an ambitious policy objective that could open the market to competing public-interest algorithms, and outlines regulatory pathways for achieving this, not only through the Digital Services Act (DSA) but also the Digital Markets Act (DMA) and the Digital Fairness Act (DFA).
In the current geopolitical landscape, fixing the logic of VLOPs’ recommender systems is as much a political as a legal challenge. There are many potential roadblocks in this effort—on the side of the enforcer (risk avoidance), on the side of the courts, once administrative measures are appealed (conservative interpretation of the DSA), and on the side of the platforms (circumvention and evasion).
In this context, the European Commission should consider a more radical and, at the same time, more promising approach than endless legal battles with tech companies: introducing vertical interoperability and opening the European social media market to recommender system algorithms provided by independent players.
What would happen if we deprived VLOPs of their monopoly over algorithmic content curation? In “Algorithmic Pluralism: Towards Competitive & Innovative Information Ecosystems”, Sherif Elsayed-Ali and Robin Berjon make the case that greater pluralism in recommender systems is essential to a healthy information environment. By enabling interoperability between platforms, users could choose among competing algorithms and move between providers without losing their data or networks—opening social media to competition and reducing the concentration of private control over information flows.
With an open marketplace for content curation services, VLOPs would no longer be the only arbiters of quality and credibility. Having been granted access to large platforms’ social graph and content data layer, independent providers could design better user experiences, including content curation that (actually!) serves the public interest rather than the narrow interests of shareholders or advertisers: for example, algorithms promoting social dialogue and quality journalism.
Do we have legal tools to force open social media platforms? While there are many legislative and political possibilities, below I suggest the three most obvious steps:
A three-step legislative roadmap to social media interoperability
(1) Mandate social media interoperability in a revised Digital Markets Act
The three-yearly review mandated by Art. 53 of the DMA specifically requires the European Commission to assess whether Art. 7 (messaging and (video) call interoperability) should be extended to social networking services. The review proposal is expected in 2026, and the European Commission has already carried out public consultations in preparation for it.
In Panoptykon’s response to the consultation, we enthusiastically argued that yes(!), the scope of Art. 7 interoperability should certainly extend to social media—and not only that. On top of basic interoperability obligations (Art. 6(7) and Art. 7 of the DMA extended to social networking services), the revised regulation should enable functional separation of the different segments of social media platforms. It should go as far as allowing dominant platforms’ users to replace core platform functionality, such as the content ranking algorithm, with an independent service.
In short, the DMA revision gives the Commission an opening to oblige gatekeepers to provide fair, reasonable and non-discriminatory access for competitors willing to offer competing or complementary social networking services.
(2) Introduce client-to-service interoperability and a right to constructive optimisation in the Digital Fairness Act
Judging by the scope and examples used in the Commission’s own questionnaire, the proposal for the Digital Fairness Act (expected in Q4 of 2026) is meant to deal with manipulative design and unfair personalisation (ideally closing the loophole in Article 25 of the DSA, which prohibits online platforms from using dark patterns but excludes practices covered by the UCPD and GDPR). Why not use this opportunity to empower European consumers to use software of their choice to interact with social media platforms that have failed to provide them with fair treatment for so many years?
Client-to-service interoperability is relatively easy to implement and already quite popular (see e.g. third-party Reddit and Twitter clients). It opens the possibility of building on top of the service, offering a better experience, greater convenience and at least partial protection against product quality degradation or user manipulation (since it is much harder to employ dark patterns on third-party clients). This solution, perhaps more than any other discussed in this piece, has the potential to quickly, visibly and tangibly improve the user experience on social media and other platforms.
In addition, the Commission should follow the recommendation of leading consumer law experts, convened by BEUC, and introduce a new right to “fair optimisation.” As argued by Natali Helberger and others in “Digital Fairness for Consumers”:
Manipulative tactics undermine the opportunity for end-users to exert deliberate control over their digital environment (self-determination). Moreover, when end-users are manipulated to act in accordance with the operator’s interests, they are also denied the opportunity to have their desires and interests heard and recognised (self-development). (p. 50)
According to Laurens Naudts and other authors of “A Right to Constructive Optimization: A Public Interest Approach to Recommender Systems in the Digital Services Act”, end-users should have access to meaningful alternative options in how a recommender system functions (including choices among operators and personalised services that meet consumer expectations). From the Digital Fairness Act perspective, it could also offer consumers the most effective protection from algorithmic manipulation and digital asymmetry (which is obviously needed on the largest social media platforms, but is certainly not limited to this space).
(3) Test user empowerment and risk mitigation potential of the Digital Services Act
Last but not least, we have the DSA, the cornerstone of online platforms’ accountability vis-à-vis their recommender systems. Even though it does not speak explicitly about third-party content curation services, it does contain a few useful articles that the Commission has yet to fully explore.
Before we move any further, the Commission should enforce Article 27(3) against non-compliant platforms. While most VLOPs have introduced non-personalised alternatives to their recommender systems “based on profiling” (following Article 38), they have made these options hard to access and non-sticky. As a next step, the Commission should take a more expansive reading of Article 27 and promote offering users a choice of a third-party content curation service (or a third-party recommender system) in addition to a non-profiling version of VLOPs’/VLOSEs’ own recommender system.
At a minimum, users of social media platforms should be given a functionality to curate their profiles, including what personal information of theirs can be used, and train content curation algorithms to follow their preferences and interests (which may change over time).
These functionalities can also be ‘encouraged’ by the guidance on mitigation measures, still to be issued under Article 35. Offering users a choice of a third-party content curation service would make perfect sense as a recommended mitigation measure for systemic risks caused by engagement-based recommender systems, be they related to mental health, civic discourse or democratic processes.
Finally, if any of the above seems too far-fetched, the DSA revision—planned for 2027—will be an opportunity to clarify obligations related to recommender systems. In particular, Article 27 could be strengthened to give platforms’ users real control over the functioning of recommender systems, turning social media platforms into the service they once promised but never delivered (remember the ‘perfect personalized newspaper’ promised by Mark Zuckerberg?).
Conclusion: What’s at stake
This essay sketches several possible pathways for reform. If these scenarios sound interesting enough to discuss, I encourage you to read my working paper and share your thoughts.
***
When it comes to reforming recommender systems in the public interest, we won’t know which approaches work better unless somebody tries. The question I have for European decision makers (especially those who, having adopted the European Democracy Shield, seem to understand the stakes) is this: Can we afford not to try?
Faced with the everyday cognitive war on platforms that used to be called “social media” but no longer deserve this name, and with the threat of traditional war on our Eastern border, we can no longer tolerate how global online platforms are used to undermine our democratic resilience and social cohesion. In the EU, we must find a way to fix their recommender systems—whether by forcing VLOPs to do so, or by taking that power away from them.
We can spend large amounts of public money on media education, fact-checking, and support for quality journalism. But the truth is that the most effective way to prevent disinformation and strengthen social cohesion still leads through social media. Real social media that connect, not divide. Services that place user value and the public interest at the center, instead of shareholder or advertiser value.
