What ChatGPT Tells Us about Gender: A Cautionary Tale about Performativity and Gender Biases in AI

Cited by: 55
Author
Gross, Nicole [1]
Affiliation
[1] Natl Coll Ireland, Sch Business, Dublin D01Y300, Ireland
Source
SOCIAL SCIENCES-BASEL | 2023, Vol. 12, Issue 8
Keywords
gender; gender bias; ChatGPT; large language models; generative AI; performativity; ethical AI; STEREOTYPES; GIRLS; PERFORMANCE
DOI
10.3390/socsci12080435
Chinese Library Classification
C [Social Sciences (General)]
Discipline Code
03; 0303
Abstract
Large language models and generative AI, such as ChatGPT, have gained influence over people's personal lives and work since their launch, and are expected to scale even further. While the promises of generative artificial intelligence are compelling, this technology harbors significant biases, including those related to gender. Gender biases create patterns of behavior and stereotypes that put women, men and gender-diverse people at a disadvantage. Gender inequalities and injustices affect society as a whole. As a social practice, gendering is achieved through the repeated citation of rituals, expectations and norms. Shared understandings are often captured in scripts, including those emerging in and from generative AI, which means that gendered views and gender biases get grafted back into social, political and economic life. This paper's central argument is that large language models work performatively, which means that they perpetuate and perhaps even amplify old and non-inclusive understandings of gender. Examples from ChatGPT are used here to illustrate some gender biases in AI. However, this paper also puts forward that AI can work to mitigate biases and act to 'undo gender'.
Pages: 15