What ChatGPT Tells Us about Gender: A Cautionary Tale about Performativity and Gender Biases in AI

Cited by: 56
Author
Gross, Nicole [1 ]
Affiliation
[1] Natl Coll Ireland, Sch Business, Dublin D01Y300, Ireland
Source
SOCIAL SCIENCES-BASEL | 2023, Vol. 12, Issue 8
Keywords
gender; gender bias; ChatGPT; large language models; generative AI; performativity; ethical AI; STEREOTYPES; GIRLS; PERFORMANCE;
DOI
10.3390/socsci12080435
Chinese Library Classification
C [Social Sciences, General];
Subject Classification Codes
03; 0303;
Abstract
Large language models and generative AI, such as ChatGPT, have gained influence over people's personal lives and work since their launch, and are expected to scale even further. While the promises of generative artificial intelligence are compelling, this technology harbors significant biases, including those related to gender. Gender biases create patterns of behavior and stereotypes that put women, men and gender-diverse people at a disadvantage. Gender inequalities and injustices affect society as a whole. As a social practice, gendering is achieved through the repeated citation of rituals, expectations and norms. Shared understandings are often captured in scripts, including those emerging in and from generative AI, which means that gendered views and gender biases get grafted back into social, political and economic life. This paper's central argument is that large language models work performatively, which means that they perpetuate and perhaps even amplify old and non-inclusive understandings of gender. Examples from ChatGPT are used here to illustrate some gender biases in AI. However, this paper also puts forward that AI can work to mitigate biases and act to 'undo gender'.
Pages: 15