Exploring Effects of Chatbot's Interpretation and Self-disclosure on Mental Illness Stigma

Cited by: 0
Authors
Cui Y. [1 ]
Lee Y.-J. [2 ]
Jamieson J. [3 ]
Yamashita N. [3 ]
Lee Y.-C. [2 ]
Affiliations
[1] Cornell Tech, New York City, NY
[2] National University of Singapore, Singapore
[3] NTT, Keihanna
Keywords
Chatbots; Conversational Agents; Mental Illness; Public Stigma; Social Stigma;
DOI
10.1145/3637329
Abstract
Chatbots are increasingly being used in mental healthcare - e.g., for assessing mental health conditions and providing digital counseling - and have been found to have considerable potential for facilitating behavioral change. Nevertheless, little research has examined how specific chatbot designs may help reduce public stigmatization of mental illness. To help fill that gap, this study explores how stigmatizing attitudes toward mental illness may be affected by conversations with chatbots that have 1) varying ways of expressing their interpretations of participants' statements and 2) different styles of self-disclosure. More specifically, we implemented and tested four chatbot designs that varied in terms of whether they interpreted participants' comments as stigmatizing or non-stigmatizing, and whether they provided stigmatizing, non-stigmatizing, or no self-disclosure of their own views. Over the two-week experiment, all four chatbots' conversations with participants centered on seven mental illness vignettes, all featuring the same character. We found that the chatbot featuring non-stigmatizing interpretations and non-stigmatizing self-disclosure performed best at reducing participants' stigmatizing attitudes, while the one that provided stigmatizing interpretations and stigmatizing self-disclosures had the least beneficial effect. We also discovered side effects of chatbots' self-disclosure: notably, self-disclosing chatbots were perceived as having strong, inflexible opinions, which undermined their credibility. As such, this paper contributes to knowledge about how chatbot designs shape users' perceptions of the chatbots themselves, and how chatbots' interpretation and self-disclosure may be leveraged to help reduce mental illness stigma. © 2024 Copyright held by the owner/author(s). Publication rights licensed to ACM.