Improving Human-Robot Interaction by Enhancing NAO Robot Awareness of Human Facial Expression

Cited by: 22
Authors
Filippini, Chiara [1 ]
Perpetuini, David [1 ]
Cardone, Daniela [1 ]
Merla, Arcangelo [1 ]
Affiliations
[1] Univ G dAnnunzio, Dept Neurosci Imaging & Clin Sci, I-66100 Chieti, Italy
Keywords
facial expression recognition; emotion recognition; human-robot interaction; affective computing; machine learning
DOI
10.3390/s21196438
CLC Classification
O65 [Analytical Chemistry]
Subject Classification Codes
070302; 081704
Abstract
An intriguing challenge in the human-robot interaction field is the prospect of endowing robots with emotional intelligence to make the interaction more genuine, intuitive, and natural. A crucial aspect in achieving this goal is the robot's capability to infer and interpret human emotions. Thanks to its design and open programming platform, the NAO humanoid robot is one of the most widely used agents for human interaction. As with person-to-person communication, facial expressions are the privileged channel for recognizing the interlocutor's emotional expressions. Although NAO is equipped with a facial expression recognition module, specific use cases may require additional features and affective computing capabilities that are not currently available. This study proposes a highly accurate convolutional-neural-network-based facial expression recognition model that is able to further enhance the NAO robot's awareness of human facial expressions and provide the robot with an interlocutor's arousal level detection capability. Indeed, the model tested during human-robot interactions was 91% and 90% accurate in recognizing happy and sad facial expressions, respectively; 75% accurate in recognizing surprised and scared expressions; and less accurate in recognizing neutral and angry expressions. Finally, the model was successfully integrated into the NAO SDK, thus allowing for high-performing facial expression classification with an inference time of 0.34 ± 0.04 s.
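The paper does not reproduce its network architecture in this record, but the pipeline it describes (a CNN that maps a face image to a probability distribution over expression classes) can be illustrated with a minimal NumPy sketch. Everything here is an assumption for illustration: the 48×48 grayscale input size, the single convolutional layer, the random weights, and the exact label set are hypothetical, not the authors' model.

```python
import numpy as np

# Hypothetical label set; the abstract mentions happy, sad, surprised,
# scared, neutral, and angry expressions.
LABELS = ["happy", "sad", "surprised", "scared", "neutral", "angry"]

def conv2d(image, kernels):
    """Valid 2-D convolution of a single-channel image with a bank of kernels."""
    kh, kw = kernels.shape[1:]
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.zeros((kernels.shape[0], oh, ow))
    for k, kernel in enumerate(kernels):
        for i in range(oh):
            for j in range(ow):
                out[k, i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def predict(image, kernels, weights, bias):
    """Conv -> ReLU -> global average pooling -> dense -> softmax."""
    feat = np.maximum(conv2d(image, kernels), 0.0)   # ReLU activation
    pooled = feat.mean(axis=(1, 2))                  # global average pooling
    logits = pooled @ weights + bias                 # dense classification layer
    exp = np.exp(logits - logits.max())              # numerically stable softmax
    return exp / exp.sum()

rng = np.random.default_rng(0)
image = rng.random((48, 48))                         # assumed 48x48 grayscale face crop
kernels = rng.standard_normal((8, 5, 5)) * 0.1       # 8 random 5x5 filters (untrained)
weights = rng.standard_normal((8, len(LABELS))) * 0.1
bias = np.zeros(len(LABELS))

probs = predict(image, kernels, weights, bias)
print(LABELS[int(np.argmax(probs))], probs.round(3))
```

With trained weights, the argmax over `probs` would give the recognized expression; the arousal-level estimate reported in the paper would be a separate output not shown in this sketch.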
Pages: 18