
Upload-Filters: Bypassing Classical Concepts of Censorship?

Amélie Pia Heldt published the article "Upload-Filters: Bypassing Classical Concepts of Censorship?" in "10 Years of JIPITEC", the anniversary issue of JIPITEC (Journal of Intellectual Property, Information Technology and Electronic Commerce Law).
 

Protecting human rights in the context of automated decision-making might not be limited to the relationship between intermediaries and their users. In fact, in order to adequately address human rights issues vis-à-vis social media platforms, we need to include the state as an actor too. In the German and European human rights frameworks, fundamental rights are in principle only applicable vertically, that is, between the state and the citizen. Where does that leave the right of freedom of expression when user-generated content is deleted by intermediaries on the basis of an agreement with a public authority? We must address this question in light of the use of artificial intelligence to moderate online speech and its (until now lacking) regulatory framework. When states create incentives for private actors to delete user content proactively, is it still accurate to examine solely the relationship between platforms and users? Are we facing an expansion of collateral censorship? Does the use of soft-law instruments, such as codes of conduct, enhance the protection of third parties, or is it rather an opaque instrument that tends to be conflated with policy laundering? This paper aims to analyse the different layers of platforms' use of artificial intelligence when it is triggered by a non-regulatory mode of governance. In light of the ongoing struggle in content moderation to balance freedom of speech against other legal interests, it is necessary to analyse whether or not intelligent technologies could meet the requirements of freedom of speech and information to a sufficient degree.
 
The full article can be read here.
The full issue of JIPITEC 10 (1) 2019 is available as a PDF (1.8 MB).

Amélie Pia Heldt, Upload-Filters: Bypassing Classical Concepts of Censorship?, 10 (2019) JIPITEC 56 para 1, https://www.jipitec.eu/issues/jipitec-10-1-2019/4877 (9 May 2019).


Publication details

Year of publication

2019
