Toward understanding social cues and signals in human-robot interaction: effects of robot gaze and proxemic behavior

Cited: 89
Authors
Fiore, Stephen M. [1 ,2 ]
Wiltshire, Travis J. [2 ]
Lobato, Emilio J. C. [2 ]
Jentsch, Florian G. [2 ,3 ]
Huang, Wesley H. [4 ]
Axelrod, Benjamin [4 ]
Affiliations
[1] Univ Cent Florida, Dept Philosophy, Cognit Sci Lab, Inst Simulat & Training, Orlando, FL 32826 USA
[2] Univ Cent Florida, Inst Simulat & Training, Orlando, FL 32826 USA
[3] Univ Cent Florida, Dept Psychol, Orlando, FL 32826 USA
[4] iRobot Corp, Bedford, MA USA
Source
FRONTIERS IN PSYCHOLOGY | 2013, Vol. 4
Keywords
human robot interaction; intention recognition; social signals; social cues; social presence; theory of mind; emotion attribution; INTENTIONS; COGNITION; CHILDREN; SCALE;
DOI
10.3389/fpsyg.2013.00859
CLC Classification
B84 [Psychology];
Discipline Classification Code
04; 0402;
Abstract
As robots are increasingly deployed in settings requiring social interaction, research is needed to examine the social signals perceived by humans when robots display certain social cues. In this paper, we report a study designed to examine how humans interpret social cues exhibited by robots. We first provide a brief overview of perspectives from social cognition in humans and how these processes are applicable to human-robot interaction (HRI). We then discuss the need to examine the relationship between social cues and signals as a function of the degree to which a robot is perceived as a socially present agent. We describe an experiment in which social cues were manipulated on an iRobot Ava (TM) mobile robotics platform in a hallway navigation scenario. Cues associated with the robot's proxemic behavior significantly affected participant perceptions of the robot's social presence and emotional state, whereas cues associated with the robot's gaze behavior did not. Further, regardless of the proxemic behavior, participants attributed more social presence and emotional states to the robot over repeated interactions than when they first interacted with it. Generally, these results underscore the importance, for HRI research, of considering how social cues expressed by a robot can differentially affect perceptions of the robot's mental states and intentions. The discussion focuses on implications for the design of robotic systems and future directions for research on the relationship between social cues and signals.
Pages: 15