
MEDIA RESEARCH BLOG

The DSA highlights codes of conduct: What can we learn from their similarities and differences to GDPR codes?

08.03.2023

The Digital Services Act (DSA) encourages the development of codes of conduct for “tackling different types of illegal content and systemic risks”, as well as codes specifically addressing online advertising and accessibility. This approach follows in the footsteps of the General Data Protection Regulation (GDPR). So what can we learn from those codes?
By Carl Vander Maelen
 
The European Union has debuted a string of legislative instruments to regulate different aspects of the information and communications technology (ICT) sector. The General Data Protection Regulation (GDPR) entered into application in 2018 and renewed the EU’s data protection framework; the proposal for an Artificial Intelligence Act (AIA) seeks to mitigate the risks posed by AI technologies; and the Digital Services Act (DSA) regulates the content offered on digital platforms.
 
While these instruments are diverse in their goals, scholars have noted that the more recent ones seem to follow the template laid out by the GDPR, a phenomenon dubbed ‘GDPR mimesis’.[1] A striking example is how articles 45-47 DSA encourage the development of codes of conduct, with several elements reminiscent of articles 40-41 GDPR and its own call for codes.
 
How similar is the DSA’s approach to codes of conduct to that of the GDPR? And what lessons can be learned from the GDPR’s successes and failures? Two elements are immediately pertinent.
Tension between soft and hard approaches
First, the interactions between codes of conduct and the instrument they are embedded in merit discussion. The GDPR clearly situates codes as secondary instruments vis-à-vis the GDPR as the primary instrument. After all, codes are “intended to contribute to the proper application” of the GDPR (article 40.1 GDPR) and have “the purpose of specifying” its application (article 40.2 GDPR). To that end:
  • codes may be “used as an element by which to demonstrate compliance” as found in:
    • Article 24.3 GDPR (obligations of the controller)
    • Article 28.5 GDPR (sufficient guarantees by processors to implement appropriate technical and organisational measures)
    • Article 32.3 GDPR (level of security appropriate to the risk)
  • “[c]ompliance with approved codes […] shall be taken into due account in assessing the impact of the processing operations” for a DPIA (article 35.8 GDPR)
  • “[w]hen deciding whether to impose an administrative fine and deciding on the amount of the administrative fine […] due regard shall be given to” adherence to codes (article 83.2.j GDPR)
Questions have been raised about how the explicitly broad and voluntary language usually associated with soft law is replaced in the GDPR by precise and seemingly more binding stipulations typically found in hard law.[2] Some scholars speak of a ‘hardening’ of soft law instruments[3] or of ‘juridification’.[4]
 
This tension between a soft and a hard approach also runs throughout the DSA and its codes. On the one hand, the DSA and the Commission go to great lengths to stress the voluntary, self-regulatory nature of codes under the DSA. See the references to codes as “voluntary” tools in recital 98 and articles 45.1 and 46.1 DSA, or in point ‘h’ of the Preamble to the 2022 Strengthened Code of Practice on Disinformation (for more on this code, see below). On the other hand, the DSA takes a very clear top-down approach. Recital 104 notes that “[t]he refusal without proper explanations […] of the Commission’s invitation to participate in the application of such a code of conduct could be taken into account” when determining whether there was an infringement of the DSA. Recital 103 DSA contains this tension within a single provision: it speaks of codes’ voluntary nature and parties’ freedom to decide whether to participate, while also stressing the importance of cooperating in developing and adhering to specific codes. Such an intertwinement of soft and hard approaches “questions the extent to which a platform could abandon the commitments it has voluntarily made”.[5]
Uptake and experimental temporality
Second, the reality of code development must be addressed. The GDPR provides a sobering view: almost five years after its entry into application, only two transnational codes have been approved,[6] and not a single ‘code having general validity’ for data transfers to third countries has been approved (article 40.3 juncto article 46.2.e GDPR). While a number of codes have been adopted at the national level, the strict monitoring requirements laid down by the GDPR and the European Data Protection Board (EDPB)[7] are noted as particular sore points that cause code developers to walk away halfway through or near the end of the development process.[8]
 
In contrast, the DSA has not yet entered into application but (potential) codes already exist. The ‘Code of Practice on Disinformation’ was originally unveiled in 2018, and later updated to the ‘2022 Strengthened Code of Practice on Disinformation’. It explicitly states in point i of its preamble that it “aims to become a Code of Conduct under Article 35 [ed: article 45 in the final text] of the DSA, after entry into force”.[9] The 2016 ‘Code of Conduct on Countering Illegal Hate Speech Online’ similarly treats topics relevant to the DSA, long before the DSA’s final text was approved. The European Commission reported at the end of 2022 that it “will discuss with the IT companies how to ensure that the implementation of the Code supports compliance with the DSA […]. This process may lead to a revision”.[10]
 
This seems to imply, then, a similar trajectory whereby the Code may be revised and slotted into the DSA’s framework. At the time of writing, work is also underway on the ‘EU Code of conduct on age-appropriate design’. Although its drafting and monitoring process seems to follow a slightly different approach (due to the establishment of a specific expert group), the European Commission similarly mentions that the code “will build on the regulatory framework provided in the [DSA] and assist with its implementation”.[11]
A recipe for success?
The availability of codes of conduct under the DSA therefore seems guaranteed, although some questions could be raised about the transparency of the process and the temporal logic of their development. These concerns go beyond theory. Ex-post assessments of the 2018 ‘Code of Practice on Disinformation’ recommended that there should be “a shift from the current flexible self-regulatory approach to a more co-regulatory one”,[12] which was realised in 2022. Remarkably, however, stakeholders already complained that the initial code was a “government-initiated ‘self-regulatory’ instrument” that did not genuinely engage with stakeholders.[13] The 2016 ‘Code of Conduct on Countering Illegal Hate Speech Online’ was similarly reported to be developed “at the behest of the European Commission under the threat of introducing statutory regulation” with the ‘systematic exclusion’ of civil society groups.[14]
 
The tension between a soft and a hard approach clearly manifests itself, and the waters are further muddied by the unusual temporal approach whereby codes were developed before the DSA’s final text had even been approved. Since the DSA – and by extension its codes of conduct – deals with fundamental societal issues such as discrimination, social inequality, and disinformation, it is crucial to involve societal stakeholders correctly. The stakes have never been higher.
 
[1] Vagelis Papakonstantinou and Paul De Hert, ‘Post GDPR EU Laws and Their GDPR Mimesis. DGA, DSA, DMA and the EU Regulation of AI’ (European Law Blog, 1 April 2021) <https://europeanlawblog.eu/2021/04/01/post-gdpr-eu-laws-and-their-gdpr-mimesis-dga-dsa-dma-and-the-eu-regulation-of-ai/> accessed 18 January 2022.
[2] Carl Vander Maelen, ‘Hardly Law or Hard Law? Investigating the Dimensions of Functionality and Legalisation of Codes of Conduct in Recent EU Legislation and the Normative Repercussions Thereof’ (2022) 47 European Law Review 752.
[3] E. Traversa and A. Flamini, ‘Fighting Harmful Tax Competition through EU State Aid Law: Will the Hardening of Soft Law Suffice?’ (2015) 14 European State Aid Law Quarterly 323.
[4] A. Beckers, ‘The Creeping Juridification of the Code of Conduct for Business Taxation: How EU Codes of Conduct Become Hard Law’ (2018) 37 Yearbook of European Law 569.
[5] Ronan Fahy, Naomi Appelman and Natali Helberger, ‘The EU’s regulatory push against disinformation: What happens if platforms refuse to cooperate?’ (Verfassungsblog, 5 August 2022) <https://verfassungsblog.de/voluntary-disinfo/> accessed 10 February 2023.
[6] Carl Vander Maelen, ‘First of Many? First GDPR Transnational Code of Conduct Officially Approved After EDPB Opinions 16/2021 and 17/2021’ (2021) 7 European Data Protection Law Review 228.
[7] European Data Protection Board, ‘Guidelines 1/2019 on Codes of Conduct and Monitoring Bodies under Regulation 2016/679 - Version 2.0 (Version Adopted after Public Consultation)’ (4 June 2019).
[8] Evert-Ben van Veen, ‘Unwarranted Requirement of an Accredited External Monitoring Body Hampers Establishing Codes of Conduct’ (MLC Foundation, 10 September 2022) <https://mlcf.eu/en/unwarranted-requirement-of-an-accredited-external-monitoring-body-hampers-establishing-codes-of-conduct/>. Although a full discussion would lead this blog post too far afield, it should be noted that the DSA takes an entirely different approach to monitoring, one driven heavily by the European Commission and the European Board for Digital Services (see, for example, article 45.4 DSA).
[9] European Commission, ‘The Strengthened Code of Practice on Disinformation 2022’ (2022) 2 <https://ec.europa.eu/newsroom/dae/redirection/document/87585>.
[10] European Commission, ‘EU Code of Conduct against Online Hate Speech: Latest Evaluation Shows Slowdown in Progress’ (24 November 2022) 1 <https://ec.europa.eu/commission/presscorner/api/files/document/print/en/ip_22_7109/IP_22_7109_EN.pdf>.
[11] European Commission, ‘Special Group on the EU Code of Conduct on Age-Appropriate Design’ <https://digital-strategy.ec.europa.eu/en/policies/group-age-appropriate-design> accessed 1 March 2023.
[12] European Regulators Group for Audiovisual Media Services, ‘ERGA Report on Disinformation: Assessment of the Implementation of the Code of Practice’ (May 2020) 52.
[13] P. H. Chase, ‘The EU Code of Practice on Disinformation: The Difficulty of Regulating a Nebulous Problem’ (Transatlantic Working Group on Content Moderation Online and Freedom of Expression, 29 August 2019) 1. See also pages 5 and 9.
[14] B. Bukovská, ‘The European Commission’s Code of Conduct for Countering Illegal Hate Speech Online: An Analysis of Freedom of Expression Implications’ (Transatlantic High Level Working Group on Content Moderation Online and Freedom of Expression, 7 May 2019) 3–4; see also similar comments in: N. Alkiviadou, ‘Hate Speech on Social Media Networks: Towards a Regulatory Framework?’ (2019) 28 Information & Communications Technology Law 19, 31.

Title photo: Sam Pak / unsplash
