Large language models, politics, and the functionalization of language
Cited by: 0
Authors:
Olya Kudina [1]
Bas de Boer [2]
Affiliations:
[1] Delft University of Technology, Department of Values, Technology & Innovation, Section on Ethics and Philosophy of Technology
[2] University of Twente, Faculty of Behavioural, Management and Social Sciences, Philosophy Section
Source: AI and Ethics | 2025 / Volume 5 / Issue 3
Keywords: Large Language Models; Democracy; Political de-skilling; Arendt; Foucault
DOI: 10.1007/s43681-024-00564-w
Abstract:
This paper critically examines the political implications of Large Language Models (LLMs), focusing on the individual and collective ability to engage in political practices. The advent of AI-based chatbots powered by LLMs has sparked debates on their democratic implications. These debates typically focus on how LLMs spread misinformation and thus hinder people's evaluative skills, which are essential for informed decision-making and deliberation. This paper suggests that, beyond the spread of misinformation, the political significance of LLMs extends to the core of political subjectivity and action. It explores how LLMs contribute to political de-skilling by influencing the capacities for critical engagement and collective action. Put differently, we explore how LLMs shape political subjectivity. We draw on Arendt's distinction between speech and language and Foucault's work on counter-conduct to articulate in what sense LLMs give rise to political de-skilling, and hence pose a threat to political subjectivity. The paper concludes by considering how to reckon with the impact of LLMs on political agency without succumbing to technological determinism, and by pointing to how the practice of parrhesia enables one to form one's political subjectivity in relation to LLMs.