Your robot therapist is not your therapist: understanding the role of AI-powered mental health chatbots

Cited by: 45
Authors
Khawaja, Zoha [1 ]
Belisle-Pipon, Jean-Christophe [1 ]
Affiliations
[1] Simon Fraser Univ, Fac Hlth Sci, Burnaby, BC, Canada
Source
FRONTIERS IN DIGITAL HEALTH | 2023, Vol. 5
Keywords
artificial intelligence; chatbot; mental health services; therapeutic misconception; AI ethics
DOI
10.3389/fdgth.2023.1278186
CLC Number
R19 [Health Care Organization and Administration (Health Service Management)]
Abstract
Artificial intelligence (AI)-powered chatbots have the potential to substantially increase access to affordable and effective mental health services by supplementing the work of clinicians. Their 24/7 availability and accessibility through a mobile phone allow individuals to obtain help whenever and wherever needed, overcoming financial and logistical barriers. Although psychological AI chatbots can significantly improve the delivery of mental health care services, they do not come without ethical and technical challenges. Major concerns include providing inadequate or harmful support, exploiting vulnerable populations, and potentially producing discriminatory advice due to algorithmic bias. Moreover, users do not always fully understand the nature of their relationship with chatbots. Significant misunderstandings can arise about the chatbot's exact purpose, particularly regarding care expectations, its ability to adapt to users' particular circumstances, and its responsiveness to users' needs and to the resources and treatments that can be offered. Hence, it is imperative that users be aware of the limited therapeutic relationship they can enjoy when interacting with mental health chatbots. Ignorance or misunderstanding of such limitations, or of the role of psychological AI chatbots, may lead to a therapeutic misconception (TM), whereby the user underestimates the restrictions of such technologies and overestimates their ability to provide actual therapeutic support and guidance. TM raises major ethical concerns that can worsen one's mental health, contributing to the global mental health crisis. This paper explores the various ways in which TM can occur, particularly through inaccurate marketing of these chatbots, the formation of a digital therapeutic alliance with them, harmful advice stemming from bias in their design and algorithms, and the chatbots' inability to foster autonomy in patients.
Pages: 13
Related Papers
67 records in total
[31]   Waiting for a digital therapist: three challenges on the path to psychotherapy delivered by artificial intelligence [J].
Grodniewicz, J. P. ;
Hohol, Mateusz .
FRONTIERS IN PSYCHIATRY, 2023, 14
[32]   AI-Based and Digital Mental Health Apps: Balancing Need and Risk [J].
Hamdoun, Salah ;
Monteleone, Rebecca ;
Bookman, Terri ;
Michael, Katina .
IEEE TECHNOLOGY AND SOCIETY MAGAZINE, 2023, 42 (01) :25-36
[33]
Happify. Happify: Science-Based Activities and Games
[34]
Center for Devices and Radiological Health, FDA. FDA, 2022, Digital Health Software Precertification (Pre-Cert) Pilot Program
[35]
Center for Devices and Radiological Health, FDA. FDA, 2023, Digital Health Center of Excellence
[36]   Clinical trials and medical care: Defining the therapeutic misconception [J].
Henderson, Gail E. ;
Churchill, Larry R. ;
Davis, Arlene M. ;
Easter, Michele M. ;
Grady, Christine ;
Joffe, Steven ;
Kass, Nancy ;
King, Nancy M. P. ;
Lidz, Charles W. ;
Miller, Franklin G. ;
Nelson, Daniel K. ;
Peppercorn, Jeffrey ;
Rothschild, Barbra Bluestone ;
Sankar, Pamela ;
Wilfond, Benjamin S. ;
Zimmer, Catherine R. .
PLOS MEDICINE, 2007, 4 (11) :1735-1738
[37]   Vulnerability in research and health care; Describing the elephant in the room? [J].
Hurst, Samia A. .
BIOETHICS, 2008, 22 (04) :191-202
[38]   An Empathy-Driven, Conversational Artificial Intelligence Agent (Wysa) for Digital Mental Well-Being: Real-World Data Evaluation Mixed-Methods Study [J].
Inkster, Becky ;
Sarda, Shubhankar ;
Subramanian, Vinod .
JMIR MHEALTH AND UHEALTH, 2018, 6 (11)
[39]   Clinical, ethical, and legal issues in e-therapy [J].
Kanani, K ;
Regehr, C .
FAMILIES IN SOCIETY-THE JOURNAL OF CONTEMPORARY HUMAN SERVICES, 2003, 84 (02) :155-162
[40]   The Ethics of Smart Pills and Self-Acting Devices: Autonomy, Truth-Telling, and Trust at the Dawn of Digital Medicine [J].
Klugman, Craig M. ;
Dunn, Laura B. ;
Schwartz, Jack ;
Cohen, I. Glenn .
AMERICAN JOURNAL OF BIOETHICS, 2018, 18 (09) :38-47