AI as a moral crumple zone: The effects of AI-mediated communication on attribution and trust

Cited by: 84
Authors
Hohenstein, Jess [1 ]
Jung, Malte [1 ]
Affiliations
[1] Cornell Univ, Dept Informat Sci, 107 Hoy Rd, Ithaca, NY 14853 USA
Funding
National Science Foundation (USA)
Keywords
Artificial intelligence (AI); Communication; AI-Mediated Communication (AI-MC); Computer-Mediated Communication (CMC); Trust; Attribution; SOCIAL COGNITIVE THEORY; TECHNICAL AGENCY; ROBOT; BEHAVIOR; TEAMS; ACCOUNTABILITY; ORGANIZATIONS; OTHERS;
DOI
10.1016/j.chb.2019.106190
Chinese Library Classification
B84 [Psychology]
Discipline classification codes
04; 0402
Abstract
AI-mediated communication (AI-MC) represents a new paradigm in which communication is augmented or generated by an intelligent system. As AI-MC becomes more prevalent, it is important to understand the effects it has on human interactions and interpersonal relationships. Previous work tells us that in human interactions with intelligent systems, misattribution is common and trust is developed and handled differently than in interactions between humans. This study uses a 2 (successful vs. unsuccessful conversation) x 2 (standard vs. AI-mediated messaging app) between-subjects design to explore whether AI mediation has any effects on attribution and trust. We show that the presence of AI-generated smart replies serves to increase perceived trust between human communicators and that, when things go awry, the AI seems to be perceived as a coercive agent, allowing it to function like a moral crumple zone and lessen the responsibility assigned to the other human communicator. These findings suggest that smart replies could be used to improve relationships and perceptions of conversational outcomes between interlocutors. Our findings also add to the existing literature on perceived agency in smart agents by illustrating that in this type of AI-MC, the AI is considered to have agency only when communication goes awry.
Pages: 13