The Social Triad Model: Considering the Deployer in a Novel Approach to Trust in Human-Robot Interaction

Times Cited: 4
Authors
Cameron, David [1 ]
Collins, Emily C. [2 ]
de Saille, Stevienna [3 ]
Eimontaite, Iveta [4 ]
Greenwood, Alice [1 ]
Law, James [5 ]
Affiliations
[1] Univ Sheffield, Informat Sch, Sheffield S10 2TN, England
[2] Northeastern Univ, Inst Experiential Robot, Boston, MA 02115 USA
[3] Univ Sheffield, Dept Sociol Studies, Sheffield S10 2TN, England
[4] Cranfield Univ, Sch Aerosp Transport & Mfg, Bedford MK43 0AL, England
[5] Univ Sheffield, Dept Comp Sci, Sheffield S10 2TN, England
Funding
UK Engineering and Physical Sciences Research Council (EPSRC)
Keywords
Trust; Human-human interaction; Transparency; Deployer; Automation; Meta-analysis; Performance; Cognition; Scale
DOI
10.1007/s12369-023-01048-3
CLC Number
TP24 [Robotics]
Discipline Classification Codes
080202; 1405
Abstract
There is increasing interest in considering, measuring, and implementing trust in human-robot interaction (HRI). New avenues in this field include identifying social means for robots to influence trust, and identifying social aspects of trust such as perceptions of robots' integrity, sincerity, or even benevolence. However, questions remain regarding robots' authenticity in obtaining trust through social means and their capacity to increase such experiences through social interaction with users. We propose that the dyadic model of HRI misses a key complexity: a robot's trustworthiness may be contingent on the user's relationship with, and opinion of, the individual or organisation deploying the robot (termed here the Deployer). We present a case study in three parts on researching HRI, and a LEGO® Serious Play® focus group on care robotics, to indicate how Users' trust towards the Deployer can affect trust towards robots and robotic research. Our Social Triad model (User, Robot, Deployer) offers novel avenues for exploring trust in a social context.
Pages: 1405-1418
Page Count: 14