Governing Digital Spaces: Addressing Illegal and Harmful User-Generated Content on Online Platforms

Cited by: 0
Author
Skocir, Manja [1,2]
Affiliations
[1] Univ Ljubljana, Inst Criminol, Fac Law Ljubljana, Ljubljana, Slovenia
[2] Univ Ljubljana, Fac Law, Ljubljana, Slovenia
Source
REVIJA ZA KRIMINALISTIKO IN KRIMINOLOGIJO | 2024, Vol. 75, No. 4
Keywords
user-generated content; Digital Services Act; platform regulation; intermediary liability; content moderation; body-image concerns; social media
DOI
Not available
Chinese Library Classification
DF [Law]; D9 [Law]
Discipline Code
0301
Abstract
The article analyses the European Union's legal framework regulating user-generated content on social media platforms. It begins by presenting the development of content moderation practices on social media platforms and describes the shift from platform self-regulation of user-generated content to more structured legal frameworks. The key focus is on whether the existing legislative framework adequately addresses the social risks arising from user-generated content. The article demonstrates that while the European Union's Digital Services Act (2022/2065) imposes obligations on platforms regarding the removal of illegal user-generated content, the regulation of harmful but legal content remains insufficient, as the moderation of such content is largely left to the platforms themselves. This highlights a gap in the European Union's approach to online safety, particularly considering the unique characteristics of the online environment, where the potential for harm is often amplified in ways that significantly differ from the offline world. In conclusion, the article emphasises the need for a more robust regulatory framework that goes beyond merely aligning online regulations with offline norms. It questions whether the principle that 'what is illegal offline must also be illegal online' adequately addresses the complexity of the digital environment. The article suggests that future regulations should adopt a harm assessment methodology that allows for the proper evaluation of the consequences of harmful but legal content. It stresses the particular importance of focusing on reducing the risks posed by the algorithmic amplification of specific content (which reveals that intermediaries play more than just a neutral role) and highlights the need to acknowledge the broader societal impacts of harmful user-generated content, including harm to third parties.
Pages: 91