Evaluation of Robot Emotion Expressions for Human-Robot Interaction

Cited by: 1
Authors
Cardenas, Pedro [1 ]
Garcia, Jose [1 ]
Begazo, Rolinson [1 ]
Aguilera, Ana [2 ,4 ]
Dongo, Irvin [1 ,3 ]
Cardinale, Yudith [1 ,4 ,5 ]
Affiliations
[1] Univ Catolica San Pablo, Elect & Elect Engn Dept, Arequipa 04001, Peru
[2] Univ Valparaiso, Escuela Ingn Informat, Valparaiso 2340000, Chile
[3] Univ Bordeaux, ESTIA Inst Technol, F-64210 Bidart, France
[4] Univ Valparaiso, Interdisciplinary Ctr Biomed Res & Hlth Engn MEDIN, Valparaiso 2340000, Chile
[5] Univ Int Valencia, Ctr Estudios Ciencia Datos Inteligencia Artificial, Valencia 46002, Spain
Keywords
Social robotics; Human-robot interaction; Emotional movements; Emotion perception; Humanoid robots
DOI
10.1007/s12369-024-01167-5
Chinese Library Classification (CLC): TP24 [Robotics]
Discipline codes: 080202; 1405
Abstract
Emotion recognition has fostered more suitable and effective human-robot interaction (HRI). In particular, social robots must imitate the expression of feelings through their voices and body gestures to improve this interaction. However, a robot's hardware limitations (few joints and limited computational resources) may restrict the quality of its expressions. To contribute to this area, we conducted a study on how humans express emotions through gestures, body language, and movements. This study enables understanding universal representations of emotions (movements and gestures) and designing similar movements for robots despite their hardware limitations. Based on that, we develop and evaluate an emotional interaction system for robots, specifically for the Pepper robot. The system uses verbal emotion recognition based on deep learning techniques to interpret user emotions and respond with movements and emojis, thus enriching the dynamics of HRI. We implemented two versions of this interaction system: an on-board implementation (the emotion recognition process runs on the robot) and a server-based implementation (the emotion recognition is performed by an external server connected to the robot). We assessed the performance of both versions, as well as the acceptance of the robot's expressions for HRI. Results show that the combined use of emotional movements and emojis by the robot significantly improves the accuracy of emotional conveyance.
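To make the server-based variant described above concrete, the following Python sketch illustrates one plausible request/response flow: an utterance recorded by the robot is sent to an external emotion-recognition server, and the returned label is mapped to a body movement plus an emoji to display on the tablet. Everything here is an assumption for illustration only, not the authors' implementation: the "/classify" endpoint, its JSON response field, the animation tags, and the emoji file names are all hypothetical placeholders.

# Minimal sketch of a server-based emotion-expression loop (illustrative only).
# Assumptions: a hypothetical HTTP endpoint "/classify" accepts a WAV file and
# returns JSON such as {"emotion": "joy"}; gesture tags and emoji paths are
# placeholders and do not come from the paper.
import requests

SERVER_URL = "http://192.168.1.50:8000/classify"   # hypothetical external server

# Illustrative mapping from a recognized emotion to an animation tag and an
# emoji image shown on the robot's tablet.
EXPRESSIONS = {
    "joy":     {"gesture": "animations/Stand/Emotions/Positive/Happy_1", "emoji": "joy.png"},
    "sadness": {"gesture": "animations/Stand/Emotions/Negative/Sad_1",   "emoji": "sadness.png"},
    "anger":   {"gesture": "animations/Stand/Emotions/Negative/Angry_1", "emoji": "anger.png"},
    "neutral": {"gesture": "animations/Stand/Emotions/Neutral/Hello_1",  "emoji": "neutral.png"},
}

def classify_utterance(wav_path: str) -> str:
    """Send a recorded utterance to the external server and return its emotion label."""
    with open(wav_path, "rb") as audio:
        response = requests.post(SERVER_URL, files={"audio": audio}, timeout=5.0)
    response.raise_for_status()
    return response.json().get("emotion", "neutral")

def express(emotion: str) -> dict:
    """Resolve the emotion label to the gesture/emoji pair the robot should play."""
    return EXPRESSIONS.get(emotion, EXPRESSIONS["neutral"])

if __name__ == "__main__":
    label = classify_utterance("utterance.wav")      # placeholder recording
    action = express(label)
    print(f"Recognized '{label}': play {action['gesture']}, show {action['emoji']}")

In the on-board variant, the HTTP call would be replaced by local inference on the robot itself; the gesture/emoji mapping step would remain the same.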
Pages: 2019-2041 (23 pages)