“If it is easy to understand, then it will have value”: Examining Perceptions of Explainable AI with Community Health Workers in Rural India

Cited by: 2
Authors
Okolo C.T. [1 ]
Agarwal D. [2 ]
Dell N. [3 ]
Vashistha A. [2 ]
Affiliations
[1] Cornell University, 350 Gates Hall, Ithaca, NY
[2] Information Science, Cornell University, Ithaca, NY
[3] Information Science, Cornell Tech, New York, NY
Funding
U.S. National Science Foundation
Keywords
Artificial Intelligence; Community Health Workers; Explainability; Global South; HCI4D; ICTD; Machine Learning; Mobile Health; XAI4D;
DOI
10.1145/3637348
Abstract
AI-driven tools are increasingly deployed to support low-skilled community health workers (CHWs) in hard-to-reach communities in the Global South. This paper examines how CHWs in rural India engage with and perceive AI explanations, and how we might design explainable AI (XAI) interfaces that are more understandable to them. We conducted semi-structured interviews with CHWs who interacted with a design probe for predicting neonatal jaundice in which AI recommendations are accompanied by explanations. We (1) identify how CHWs interpreted AI predictions and the associated explanations, (2) unpack the benefits and pitfalls they perceived in the explanations, and (3) detail how different design elements of the explanations affected their understanding of the AI. Our findings show that although CHWs struggled to understand the AI explanations, they nevertheless expressed a strong preference for explanations to be integrated into AI-driven tools and perceived several benefits of them, such as helping CHWs learn new skills and improving patient trust in AI tools and in CHWs. We conclude by discussing which elements of AI need to be made explainable to novice AI users like CHWs and outline concrete design recommendations to improve the utility of XAI for novice AI users in non-Western contexts. © 2024 Copyright held by the owner/author(s). Publication rights licensed to ACM.