In between illegal and harmful: a look at the Community Guidelines and Terms of Use of online platforms in the light of the DSA proposal and the fundamental right to freedom of expression (part 3 of 3)

By Britt van den Branden, Sophie Davidse and Eva Smit*

This blogpost is the third and final part of our trilogy. See blog 1 here and blog 2 here.

After analysing the Community Guidelines and Terms of Use (hereafter: CGs and ToUs) of six platforms in the first blogpost and applying them to case studies in the second blogpost, it is now time to look at the issues we have identified so far in the light of the objectives of the DSA. More specifically, in this third and final blogpost we discuss whether the current DSA provision on terms and conditions, as laid down in Article 12 of the DSA proposal, effectively tackles harmful content while at the same time guaranteeing freedom of expression in an online environment.

A short summary

In the first blog of this trilogy, we presented a visual representation of the CGs and ToUs of six platforms – YouTube, Twitter, Snapchat, Instagram, TikTok and Facebook – in the form of a set of comparative tables. It became apparent that the content moderation practices described by the platforms, with their own rules and exceptions, are broadly worded and often rather vague. This was confirmed when looking at the case studies in our second blogpost. When studying the actions taken by the platforms, it became clear that although the moderation practices were mostly in line with the CGs and ToUs, these policies provide little certainty from the perspective of the users. For example, no clear definitions are given of what harmful content entails, and it is unclear which enforcement action will be taken and when. Consequently, a wide variety of content and occurrences can be classified under the prohibitions of the CGs and ToUs, combined with somewhat arbitrary enforcement. As we have concluded before, it is therefore questionable whether users could have predicted the actions of the platforms in question based on the CGs and ToUs.

Foreseeability, accessibility and predictability are key requirements for the laws under which freedom of expression can be restricted. In matters of platform regulation, therefore, it is important that the CGs and ToUs are foreseeable and predictable for users in order to be consistent with, for example, Article 10 ECHR. The relevant question here is: to what extent can the DSA, and in particular Article 12 on terms and conditions, contribute to more transparency and certainty from the viewpoint of the platform user? And how could this transparency be enhanced?

The DSA and its goals

What are the main goals of the DSA? Among other policy objectives, the DSA aims to create a safer digital environment in which fundamental rights are protected. This includes strengthening the rights of users of digital services. The liability framework of the DSA is consistent with the one set out by the e-Commerce Directive. In addition, due diligence standards and transparency obligations are set out in Chapter III. These newly introduced measures are especially relevant in light of the case studies discussed in the second blogpost, because they aim, among other things, to improve transparency regarding content moderation practices. More specifically, and relevant for our research, the DSA aims to regulate the platforms’ terms and conditions by introducing, in Article 12 DSA, transparency obligations regarding these contractual provisions and due diligence obligations regarding their enforcement. The obligations arising from this provision apply to content which providers of intermediary services consider illegal or contrary to their terms and conditions. The article therefore goes beyond tackling only illegal content.

The first paragraph of Article 12 requires providers of intermediary services to include information about the restrictions they impose on their service, including information on any ‘policies, procedures, measures and tools used for the purpose of content moderation’. The second paragraph imposes the obligation to ‘act in a diligent, objective and proportionate manner’ when applying the above-mentioned restrictions, ‘with due regard to the rights and legitimate interests of all parties involved, including the applicable fundamental rights of the recipients of the service as enshrined in the Charter’. As recently pointed out by IViR researchers, it is, however, doubtful whether this provision requires providers to directly apply EU fundamental rights law in their content moderation decisions.

The issue with the current Article 12 DSA proposal on terms and conditions

What we have found so far is that the CGs and ToUs of the platforms we analysed are too vague, especially when it comes to their application in practice. Based on these policies alone, it is difficult for users to predict which content moderation practices the platforms will employ and when. A logical response to this problem is thus to impose stricter obligations on the platforms with regard to their CGs and ToUs, in order to make them more transparent.

However, we have found that the wording of Article 12 DSA is quite vague in both paragraphs. It remains unclear, for example, in what manner the requested information should be communicated, other than that it should be in ‘clear and unambiguous language’. This requirement sets a higher threshold than the GDPR, under which information should be given in ‘clear and plain language’. It is, however, not clear what this difference in wording entails in practice, and more guidance would be needed for very large online platforms (VLOPs) to implement this information obligation.

This raises the thorny question of whether the proposed provision will actually incentivise platforms to provide users with a clearer scope and definition of the content that is prohibited under their CGs and ToUs. The VLOPs might be of the opinion that they already provide the public with sufficient information about the restrictions they impose on their service, but, as our case studies have shown, it is doubtful whether this information is clear enough. In light of this, the current regime does not clarify how much discretion VLOPs have in complying with the information obligation set out in Article 12. It is also unclear how to reconcile their private nature with the vague reference to fundamental rights in the provision: platforms are private companies and are therefore not directly bound by international freedom of expression standards. This is, however, a subject of discussion. Although these platforms are private actors, they are often described as gatekeepers of news and information. Their services are used to exchange information and ideas and to discuss public matters. These privately owned platforms therefore do operate in the public sphere, which is often captured by the notion of their ‘quasi-public functions’.

Moreover, one of the risks of the current liability regime is that of over-removal, which results from platforms choosing the most risk-averse approach and therefore taking content down when in doubt. Although the due diligence obligation in the second paragraph of Article 12 could contribute to limiting the problem of over-removal, it is questionable whether this provision will counteract the problems caused by the current ToUs and their inconsistent enforcement: uncertainty, unpredictability and their impact on freedom of expression online. It is therefore up for debate whether freedom of expression is duly protected when tackling harmful content, and with that, whether one of the primary aims of the DSA is met. In fact, concerns have been raised that the subjective wording of ‘diligent, objective and proportionate’ will only increase the number of dissatisfied users.

Possible solutions for Article 12 of the DSA proposal

It might seem that the solution lies in adjusting the wording of Article 12. If Article 12 DSA were to contain more stringent transparency requirements, this might help users form clearer expectations of how content moderation works in practice on the basis of the CGs and ToUs. However, platforms could then end up moderating content more extensively than necessary in order to guarantee that they meet those (more specific) contractual terms. This could, again, potentially lead to over-removal.

On the other hand, the current open wording is far from ideal either, as it remains unclear how the article should be interpreted and what obligations it imposes. This is especially problematic with regard to the Charter of Fundamental Rights, as mentioned in the second paragraph. According to ongoing debates, two interpretations are possible: VLOPs should simply take note of the Charter, or the Charter becomes directly applicable in the relationship between the VLOPs and their users. While the first approach is unsatisfactory, the second seems problematic in itself because of the private nature of the VLOPs.

The fact that VLOPs do not enforce their terms consistently was once again demonstrated by the recent revelations that different terms and conditions apply depending on the status of the user in question. This goes against the entire concept of terms and conditions and, more importantly, does nothing to enhance predictability and foreseeability from the perspective of the user. Another relevant and shocking example is the latest revelation by Facebook whistleblower Frances Haugen. Supported by internal documents, her account detailed how Facebook’s latest amendments to its News Feed feature have a damaging effect on teenagers’ mental health. For example, the company had been aware that its systems lead teenagers to anorexia-related content. As we mentioned in our first blogpost, content related to self-harm, including eating disorders, is regulated in Facebook’s ToUs and CGs. Although Facebook does maintain a clear definition of this category, our research has also demonstrated that there is little certainty as to whether a particular type of content falls within the definition and, if so, when the exceptions apply. Nevertheless, Facebook chose not to implement alternative options for its News Feed because of the effect this might have on “meaningful social interactions”. While vague terms of service lead to uncertainty among users, these examples also illustrate the risk of VLOPs making behind-the-scenes trade-offs between dealing with harmful content and business profits. Both examples show how companies unfortunately put profit before safety, and underline the importance of clarifying how these private platforms are bound by fundamental rights in their content moderation practices.

The first step towards a solution, or at least clarification, would therefore be for the EU legislator to elucidate the meaning and scope of the article. As has been noted, in the absence of further explanation, the provision is in danger of becoming no more than a paper tiger. In the European Parliament IMCO Committee meeting of 27 September, Article 12 was not specifically discussed; nor was it mentioned by rapporteur Christel Schaldemose among the most contentious DSA areas to be addressed in the forthcoming negotiations (these mainly concern the position of SMEs, targeted ads, the regulation of algorithms and product safety). In any case, the EP’s position on Article 12 will soon become clearer, as in the coming weeks the IMCO Committee will adopt its final opinion on the DSA, which will then be voted upon by the EP in plenary session.

Concluding remarks

All in all, we believe that the DSA is in principle a very promising piece of regulation. It has the potential to safeguard freedom of expression online, and the fact that more obligations are imposed on the platforms is to be welcomed. Nonetheless, as is often the case with new regulation, it still has its teething problems. As it stands, we have some doubts as to whether Article 12 DSA is capable of solving the problems stemming from the vagueness of the platforms’ CGs and ToUs, especially where the implications of Article 12 are not given appropriate attention in the ongoing DSA debate and negotiations. As the legislative process is still ongoing, only time will tell.


* Britt van den Branden, Sophie Davidse and Eva Smit are students of the IViR research master in information law and of the “Glushko & Samuelson Information Law and Policy Lab” (ILPLab). For this ILPLab research project, they have worked in partnership with the DSA Observatory and with AWO data rights agency.