THE INVISIBLE POLITICAL OFFICER: HOW PERSONALIZATION ALGORITHMS SHAPE PUBLIC OPINION

Times Cited: 0
Authors
Toloknev, K. A. [1 ]
Affiliations
[1] HSE Univ, Doctoral Sch Polit Sci, Moscow, Russia
Source
POLITEIA-JOURNAL OF POLITICAL THEORY POLITICAL PHILOSOPHY AND SOCIOLOGY OF POLITICS | 2022, Vol. 107, No. 4
Keywords
echo chamber; filter bubble; social media; computational modeling; agent-based model; public opinion; political communications; FILTER BUBBLES; ECHO CHAMBERS; SOCIAL MEDIA; NEWS; EXPOSURE;
DOI
10.30570/2078-5089-2022-107-4-63-82
Chinese Library Classification
D0 [Political Science, Political Theory];
Discipline Codes
0302; 030201;
Abstract
Social media have become firmly entrenched in modern everyday life, yet their influence on the formation of public opinion is still not well understood. An important feature of social media is that they are not neutral. Not only do people interact with each other on social media platforms, but the platforms themselves actively interact with people, selecting personalized content for them based on information about their interests and behavior. In 2011, Eli Pariser hypothesized that content personalization should lead to the formation of a kind of "information cocoons", or "filter bubbles" - homogeneous groups of users who hold similar views. However, the fragmentation of the Internet community into "filter bubbles" is not the only threat posed by the use of personalization algorithms. Even more dangerous is the possibility that social media operators may manipulate content selection algorithms in order to influence users' views.
Pages: 63-82
Page count: 20
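
The abstract describes personalization as a feedback loop: the platform learns a user's position and feeds them nearby content, which in turn narrows what the user sees and can be biased to steer opinions. Below is a minimal, hypothetical agent-based sketch of that loop, not the author's model from the article; the one-dimensional opinion space, the nearest-item recommender, and parameters such as PLATFORM_BIAS and LEARNING_RATE are illustrative assumptions only.

```python
# Hypothetical sketch: personalized recommendation plus opinion assimilation.
# With PLATFORM_BIAS = 0 each user is repeatedly shown content near their own
# position (a "filter bubble"); a nonzero bias drags the whole distribution.

import random

N_AGENTS = 200        # number of users
N_ITEMS = 500         # content items, each with a fixed ideological "position"
STEPS = 2000          # simulation steps
PERSONALIZATION = 0.9 # probability the feed picks an item near the user's opinion
LEARNING_RATE = 0.05  # how strongly a user shifts toward the content shown
PLATFORM_BIAS = 0.0   # set to e.g. +0.3 to model a manipulated recommender

random.seed(1)
opinions = [random.uniform(-1.0, 1.0) for _ in range(N_AGENTS)]
items = [random.uniform(-1.0, 1.0) for _ in range(N_ITEMS)]

def recommend(opinion):
    """Return a content item: usually the one closest to the user's current
    opinion (personalization), occasionally a random one; the platform may
    nudge the recommendation target toward PLATFORM_BIAS."""
    target = (1 - abs(PLATFORM_BIAS)) * opinion + PLATFORM_BIAS
    if random.random() < PERSONALIZATION:
        return min(items, key=lambda x: abs(x - target))
    return random.choice(items)

for _ in range(STEPS):
    i = random.randrange(N_AGENTS)
    shown = recommend(opinions[i])
    # the user assimilates toward the content they were shown
    opinions[i] += LEARNING_RATE * (shown - opinions[i])
    opinions[i] = max(-1.0, min(1.0, opinions[i]))

print("mean opinion:", sum(opinions) / N_AGENTS)
```

Under these assumptions, running the script with PLATFORM_BIAS = 0 leaves the population dispersed while each user's feed narrows around their own position, whereas a nonzero bias shifts the mean opinion toward the platform's chosen direction, illustrating the two risks the abstract distinguishes.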