The DSA proposal and Poland
Lidia Dutkiewicz, Jan Czarnocki
(Center for IT and IP law – CiTiP, KU Leuven)
Disclaimer: Dear reader, please note that this commentary was published before the DSA was finalised and is therefore based on an outdated version of the DSA draft proposal. The DSA’s final text, which can be found here, differs in numerous ways, including a revised numbering for many of its articles.
The position of Poland on the DSA proposal
The Polish government’s proposed amendments to the DSA proposal have been relatively significant. In its position published on 14 April 2021, the Polish government sets out the following key recommendations for the DSA. First, the Polish government seeks more competences for national regulators (Digital Services Coordinators), so that they can have “a real impact” on proceedings concerning user complaints against service providers. In its recent position expressed towards the Council, the Polish government proposes adding a new point to Art. 8. According to this new provision, the Digital Services Coordinator of each Member State should, on its own initiative and within 72 hours of receiving a copy of an order to act, have the right to scrutinise the order to determine whether it “seriously or manifestly infringes” the respective Member State’s law, and the right to revoke the order or render it ineffective/not applicable on its own territory. Second, content moderation by large platforms should take into account as much as possible “the socio-cultural context of the user’s country”. The government notes that, given the importance of large social networks to public debate, their power to determine what speech is acceptable does not really provide a guarantee of freedom of expression. To that end, users must have the right to a fast-track judicial appeal procedure against a platform’s decision directly before the national court of the country in which they live or reside. This right must not be conditional on the use of the platform’s internal complaint-handling mechanism. Third, in its position expressed towards the Council, the Polish government seeks a more effective mechanism for VLOPs to act against harmful content, including disinformation. In particular, VLOPs “should” undertake actions to mitigate illegal and harmful content in the services they provide.
Fourth, in its position expressed towards the Council, the Polish government adds that, in order to “counteract the arbitrary and unjustified isolation of certain actors” from access to information, the DSA should also provide for the possibility of issuing an order to restore access to content. Fifth, the Polish government is in favour of maintaining the absence of a general monitoring obligation. It favours rules that reconcile platforms’ proactivity in searching for and removing illegal content with their simultaneous protection against being automatically assigned responsibility for content placed on their services by third parties. The Polish government supports the introduction of a so-called Good Samaritan clause. Finally, in line with its position of 14 April 2021, the Polish government underlines that what is illegal in one Member State may be legal in another, and that this should be reflected in orders issued under Article 8 DSA. The DSA should therefore prevent the issuing of Europe-wide takedown orders where they infringe the national law of only some Member States.
The broader legal, political and institutional context
In Poland, discussions on platform regulation, and in particular on content moderation policies, have been ongoing for several years. In November 2016, Poland’s then-digital economy minister accused Facebook of “censorship” after the platform suspended some users’ profiles for using symbols related to fascism dating to the 1930s. The nationalist-libertarian party “Konfederacja” held protests in front of Facebook’s headquarters in Warsaw in 2016.
In October 2017, in an interview for the Polish Press Agency, Anna Streżyńska, the then-Minister of Digitization, announced that the Ministry was working on a new law that would “limit hate speech and manipulation on the Internet”. According to Streżyńska, moral standards are not always sufficient to maintain public order, and the Internet needs to be regulated to eliminate these threats. To this end, the main aim of the new law would be the return of “true freedom of speech” on the Internet. However, the draft law was never published.
The discussion on the alleged “censorship” by Facebook of far-right symbols, including symbols related to fascism, continued until April 2018, when Facebook decided to remove all posts promoting the National Radical Camp (Polish: Obóz Narodowo-Radykalny, ONR) and the National Rebirth of Poland (Polish: Narodowe Odrodzenie Polski, NOP), as well as the use of both organisations’ symbols in posts. Facebook argued that it forbids the spreading of hatred and that it does not accept groups or organisations that “promote violence or attack people based on characteristics protected by law”. Facebook pointed out that the ONR and the NOP had “consistently violated its rules by openly promoting racist, anti-Semitic and homophobic views”. In this context, it is important to note that, as observed by the Council of Europe Commissioner for Human Rights, Poland’s criminal law does not prohibit hate speech or hate crime on grounds of sexual orientation or gender identity; such speech is not criminalised and is therefore not illegal. It thus remains an important point of contention that platforms remove transphobic or homophobic content which violates their community standards but is not illegal under Polish law.
In the aftermath, in November 2018, the Polish government and Facebook signed a Memorandum of Understanding. According to the then-Minister of Digitization, Marek Zagórski, the agreement “supports the protection of freedom of speech on the Internet” and gives Facebook users an additional instance of appeal against removed content. This new instance, or ‘point of contact’, was launched on 14 December 2018; through it, under certain conditions, Facebook users can challenge Facebook’s decisions to delete their content, accounts or profiles. EDRi has reported on this issue here. In his response to a parliamentary question from Polish MP Monika Rosa, Zagórski stated that from December 2018 to 26 May 2020 there were 2,168 requests to restore deleted profiles or content, of which 605 were resolved positively by Facebook.
With regard to online platforms, Poland had its own precursors to Facebook. One example is the social network originally known as Nasza Klasa (English: Our Class), which later became NK.pl. Founded and launched in November 2006 by four computer science students from the University of Wrocław, Nasza Klasa was seen as a way to reconnect with school friends. At its peak, the platform had 13.5 million active users and was visited by more than 50% of Polish Internet users each month. In July 2021, Nasza Klasa closed after 15 years.
Another example is wykop.pl (English: dig out), a Polish social network site founded in 2005 and modelled on the US websites Reddit and digg.com. Recently, a scandal broke when Jagoda Grondecka, a Polish journalist reporting on the Taliban’s takeover of Afghanistan, revealed a Wykop thread containing hate speech and criminal threats against her. As shown in Wykop’s archives, despite reports by its users, Wykop’s moderators have systematically allowed racist, homophobic and anti-Semitic content. The lack of reaction against the promotion of violence, paedophilic material and, popular among Polish teenagers, “patostreaming” shows similarities between Wykop and the infamous 4chan platform.
Finally, not long after Donald Trump’s deplatforming, on 20 January 2021 a new Polish social networking site, Albicla, was set up by the right-wing Gazeta Polska‘s editor-in-chief, Tomasz Sakiewicz. Albicla, an abbreviation of the phrase “Let All Be Clear”, was meant to become “a new response to online censorship”. At the time of writing, Albicla is still running, although with only 76,000 registered users its popularity is minimal. As reported by the press, this might be due to technical and security issues, including data breaches and a source code leak.
As regards civil society organisations active in these areas, the Panoptykon Foundation, a member of EDRi, was established in April 2009. It aims to protect human rights against surveillance through watchdog, advocacy and education activities. It has prepared a series of analyses concerning Internet regulation and the use of CCTV and telecommunications data by public institutions. Panoptykon has also played an important role by supporting the Civil Society Drug Policy Initiative (SIN), a Polish non-profit organisation promoting evidence-based drug policy, in a “landmark censorship lawsuit” against Facebook. The case concerns the allegedly non-transparent and arbitrary removal of drug-related educational content from the organisation’s accounts on Facebook and Instagram in 2018 and 2019. SIN argued that the blocking of its content, based on alleged violations of Facebook’s Community Standards, unjustifiably restricted its right to disseminate information, to express opinions and to communicate with users. In July 2021, the District Court in Warsaw (Appellate Division) upheld its 2019 interim measures ruling, in which it temporarily prohibited Facebook from removing SIN’s pages and posts on Facebook and Instagram. The decision is now final. Centrum Cyfrowe, part of the COMMUNIA association, is another important NGO, working on free access to knowledge and culture, freedom of speech and creative expression. Its main focus is on copyright, copyright for remote education and open culture. Centrum Cyfrowe has also voiced criticism of the content filtering provided for in Article 17 of the Copyright in the Digital Single Market Directive and in the Terrorist Content Online Regulation.
Recent legal and political developments
On 17 December 2020, the Minister of Justice and Public Prosecutor General, Zbigniew Ziobro, presented a plan for a “breakthrough law on freedom of expression on the Internet”. The Ministry of Justice announced that the new provisions would “effectively implement the constitutional right of freedom of expression and help protect against fake news on the Internet”. During the press conference, the Minister stressed that content posted on online platforms is often subject to “unwanted interference” and is deleted even when not contrary to Polish law. Moreover, the “victims of ideologically motivated censorship” are often members of religious and right-wing groups whose content is deleted or blocked on the Internet. The Secretary of State, Sebastian Kaleta, added that “it is time for Poland to set up regulations shielding the freedom of expression on the Internet from the abuses of corporate giants.” Kaleta also stressed that provisions in effect in other countries, mainly Germany and France, as well as the DSA proposal, “stress the prompt removal of content considered to infringe national law rather than protecting the freedom of expression”, and that the nature of such regulations is therefore “primarily repressive”. To that end, Poland wants to adopt its own regulations to “successfully protect the constitutional right to freedom of expression”.
Not long after Donald Trump’s deplatforming, on 13 January 2021, Konfederacja, a far-right nationalist-libertarian party, presented its “social media freedom of expression package”. The “package” foresees, among other things, “terms & conditions compliant with Polish law”. Moreover, it would oblige platforms to justify decisions to remove posts and profiles and to offer the possibility of appeal to a national court, which should decide on the issue within 48 hours in electronic proceedings.
On 1 February 2021, the Ministry of Justice finally published the text of a draft law “on the protection of freedom of expression in online social networks”. Art. 14 of the draft law proposes a so-called Council for Freedom of Speech, a public administrative body that would oversee the validity of social networks’ content moderation decisions. As argued by Panoptykon, the current text guarantees neither the independence nor the expertise of this body and allows members of the ruling party to sit on the Council. Chapter 4 of the draft act obliges platforms to conduct an internal review procedure for users’ complaints concerning the restriction of access to their content or user profile and the dissemination of “unlawful content”. The platform is obliged to respond within 48 hours. A user dissatisfied with the way the complaint has been handled in the internal review procedure may file a complaint with the Council. The Council can then issue a decision ordering the restoration of restricted access to the content or user profile if it finds that the content or profile does not constitute “unlawful content”. Art. 3(8) provides a broad definition of “unlawful content”, which also includes illegal content (i.e. content of a criminal nature). “Unlawful content” means content that violates personal rights, “disinformation” (defined in Art. 3(6)), content of a criminal nature, as well as content that violates “good morals/decency”, in particular if it disseminates or advocates “violence, suffering or humiliation”. The draft law provides for the possibility of appealing the Council’s decisions to an administrative court. According to the draft law, online platforms would also have to provide public authorities with access to any relevant information regarding proceedings concerning content moderation decisions.
Moreover, the draft law introduces mandatory 12-month data retention for companies providing electronic services, which raises questions regarding its compatibility with the case law of the Court of Justice of the European Union (CJEU) (e.g. Cases C-511/18 and C-512/18, La Quadrature du Net).
On 29 September 2021, the draft law finally appeared in the official list of legislative work. However, at the moment it is not very likely that the draft law will enter into force any time soon. This is due to the fragile political position of the Minister of Justice, who represents the Solidarna Polska party, the junior member of the governing coalition. According to press reports, there is no agreement between Solidarna Polska and the Law and Justice party.
In addition, the left-wing parliamentary club has also come up with an initiative to regulate platforms at the national level, presenting a draft law on access to digital content. The draft addresses issues that are already covered by the DSA proposal. It also regulates “harmful” but not illegal content, that is, content which is “manifestly contrary to socially acceptable norms”, “inappropriate for a specific user group” or “false content”. Given the political situation in Poland, it is unlikely that this draft law will proceed.
Despite all the talk, it seems that the major political forces in Poland do not consider content moderation a priority. It is not likely that Poland will adopt its own content moderation law in the near future.
Another area of legal and political development in Poland relates to the Copyright in the Digital Single Market Directive (DSM Directive). On 24 May 2019, Poland brought an application before the CJEU seeking annulment of Article 17(4)(b) and (c) of the DSM Directive, concerning the liability of “online content-sharing service providers” for content uploaded by users (Case C-401/19). In its application, Poland argues that imposing an obligation on platforms to make best efforts to prevent future uploads of specific works in fact amounts to automatic filtering of user-uploaded content. According to the application, such “preventive control” mechanisms would “undermine the essence of the right of freedom of expression and information” and “do not comply with the requirement that limitations imposed on that right be proportional and necessary”. In the words of Deputy Prime Minister Piotr Gliński, “this leads to the introduction of solutions that have the features of preventive censorship”. The Advocate General’s Opinion was issued on 15 July 2021.
Lidia Dutkiewicz is a research associate at the Center for IT and IP law (CiTiP), KU Leuven. She conducts research on content moderation, recommender systems and artificial intelligence. She is currently involved in the Horizon 2020 project AI4Media (ICT-48-2020), which aims at developing ethical and trustworthy AI technology for the European media landscape.
Jan Czarnocki is a doctoral researcher at the Center for IT and IP law (CiTiP), KU Leuven. He holds an LL.M. degree in comparative law from the China University of Political Science and Law in Beijing and a master’s degree in law from the University of Warsaw. He was a trainee in the External Policies Directorate of the European People’s Party Group in the European Parliament and a “European View” editor-intern at the Wilfried Martens Centre for European Studies.
This contribution is part of an independent research project on mapping the position of and situation in the Member States with respect to the Digital Services Act (DSA), for which the DSA Observatory has received a (partial) financial contribution from our partner in this project, EDRi.
Photo by Maksym Harbar on Unsplash