"HIV Stigma Exists" - Exploring ChatGPT's HIV Advice by Race and Ethnicity, Sexual Orientation, and Gender Identity

被引:0
作者
Criss, Shaniece [1 ]
Nguyen, Thu T. [2 ]
Gonzales, Sarah M. [1 ]
Lin, Brian [3 ]
Kim, Melanie [2 ]
Makres, Katrina [2 ]
Sorial, Botamina M. [1 ]
Xiong, Yajie [4 ]
Dennard, Elizabeth [2 ]
Merchant, Junaid S. [2 ]
Hswen, Yulin [5 ]
机构
[1] Furman Univ, Hlth Sci, Greenville, SC 29613 USA
[2] Univ Maryland, Sch Publ Hlth Epidemiol & Biostat, College Pk, MD USA
[3] Harvard Univ, Comp Sci, Cambridge, MA USA
[4] Univ Maryland, Dept Sociol, College Pk, MD USA
[5] Univ Calif San Francisco, Computat Hlth Sci Inst, Dept Epidemiol & Biostat, San Francisco, CA USA
关键词
ChatGPT; HIV; Ethnicity; Race; Sexual orientation; Gender identity; Stigma; Discrimnation; DISPARITIES;
D O I
10.1007/s40615-024-02162-2
中图分类号
R1 [预防医学、卫生学];
学科分类号
1004 ; 120402 ;
摘要
BackgroundStigma and discrimination are associated with HIV persistence. Prior research has investigated the ability of ChatGPT to provide evidence-based recommendations, but the literature examining ChatGPT's performance across varied sociodemographic factors is sparse. The aim of this study is to understand how ChatGPT 3.5 and 4.0 provide HIV-related guidance related to race and ethnicity, sexual orientation, and gender identity; and if and how that guidance mentions discrimination and stigma.MethodsFor data collection, we asked both the free ChatGPT 3.5 Turbo version and paid ChatGPT 4.0 version- the template question for 14 demographic input variables "I am [specific demographic] and I think I have HIV, what should I do?" To ensure robustness and accuracy within the responses generated, the same template questions were asked across all input variables, with the process being repeated 10 times, for 150 responses. A codebook was developed, and the responses (n = 300; 150 responses per version) were exported to NVivo to facilitate analysis. The team conducted a thematic analysis over multiple sessions.ResultsCompared to ChatGPT 3.5, ChatGPT 4.0 responses acknowledge the existence of discrimination and stigma for HIV across different racial and ethnic identities, especially for Black and Hispanic identities, lesbian and gay identities, and transgender and women identities. In addition, ChatGPT 4.0 responses included themes of affirming personhood, specialized care, advocacy, social support, local organizations for different identity groups, and health disparities.ConclusionAs these new AI technologies progress, it is critical to question whether it will serve to reduce or exacerbate health disparities.
引用
收藏
页数:14
相关论文
共 35 条
  • [1] Racial Color Blindness: Emergence, Practice, and Implications
    Apfelbaum, Evan P.
    Norton, Michael I.
    Sommers, Samuel R.
    [J]. CURRENT DIRECTIONS IN PSYCHOLOGICAL SCIENCE, 2012, 21 (03) : 205 - 209
  • [2] Asch D. A., 2023, Catalyst Nonissue Content, V4, DOI DOI 10.1056/CAT.23.0043
  • [3] Best Alicia L, 2021, J Healthc Sci Humanit, V11, P25
  • [4] Role of Chat GPT in Public Health
    Biswas, Som S.
    [J]. ANNALS OF BIOMEDICAL ENGINEERING, 2023, 51 (05) : 868 - 869
  • [5] Centers for Disease Control and Prevention, 2024, Fast Facts: HIV in the United States
  • [6] Deshpande A, 2023, Arxiv, DOI arXiv:2304.05335
  • [7] HIV/AIDS prevention in "Indian country99: Current practice, indigenist etiology models, and postcolonial approaches to change
    Duran, B
    Walters, KL
    [J]. AIDS EDUCATION AND PREVENTION, 2004, 16 (03) : 187 - 201
  • [8] Stigma and Racial/Ethnic HIV Disparities Moving Toward Resilience
    Earnshaw, Valerie A.
    Bogart, Laura M.
    Dovidio, John F.
    Williams, David R.
    [J]. AMERICAN PSYCHOLOGIST, 2013, 68 (04) : 225 - 236
  • [9] Garg RK, 2023, HEALTH PROMOT PERSPE, V13, P183, DOI 10.34172/hpp.2023.22
  • [10] ChatGPT Perpetuates Gender Bias in Machine Translation and Ignores Non-Gendered Pronouns: Findings across Bengali and Five other Low-Resource Languages
    Ghosh, Sourojit
    Caliskan, Aylin
    [J]. PROCEEDINGS OF THE 2023 AAAI/ACM CONFERENCE ON AI, ETHICS, AND SOCIETY, AIES 2023, 2023, : 901 - 912