A Sociable Human-robot Interaction Scheme Based on Body Emotion Analysis

Cited by: 0
Authors
Tehao Zhu
Zeyang Xia
Jiaqi Dong
Qunfei Zhao
Affiliations
[1] Department of Automation, Shanghai Jiao Tong University
[2] Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences
Source
International Journal of Control, Automation and Systems | 2019, Vol. 17
Keywords
Body emotion analysis; finite state machine; fuzzy inference; human-robot interaction; Laban movement analysis
DOI
Not available
Abstract
Many kinds of interaction schemes for human-robot interaction (HRI) have been reported in recent years. However, most of these schemes rely on recognizing human actions; once the recognition algorithm fails, the robot cannot proceed with its reactions. This issue is overlooked in traditional HRI, yet it is the key to further improving the fluency and friendliness of HRI. In this work, a sociable HRI (SoHRI) scheme based on body emotion analysis was developed to achieve reasonable and natural interaction even when human actions are not recognized. First, the emotions conveyed by the dynamic movements and static poses of humans were quantified using Laban movement analysis. Second, an interaction strategy based on a finite state machine model was designed to describe the transition rules of the human emotion state. Finally, an appropriate interactive behavior for the robot was selected according to the inferred human emotion state. The quantification effect of SoHRI was verified on the UTD-MHAD dataset, and the whole scheme was evaluated through questionnaires completed by participants and spectators. The experimental results showed that the SoHRI scheme can analyze body emotion precisely and help the robot choose reasonable interactive behaviors.
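The pipeline the abstract outlines (quantify body emotion, track the emotion state with a finite state machine, then select a robot behavior from the inferred state) can be sketched as below. The state names, transition table, thresholds, and behaviors are illustrative assumptions, not the paper's actual parameters.

```python
# Minimal sketch of the SoHRI control loop described in the abstract:
# a continuous body-emotion score (e.g. from Laban movement analysis)
# drives a finite state machine over coarse emotion states, and the
# current state selects the robot's interactive behavior.
# All names and thresholds here are assumptions for illustration.

# Transition table: (current state, quantized emotion level) -> next state
TRANSITIONS = {
    ("calm", "low"): "calm",
    ("calm", "high"): "excited",
    ("excited", "low"): "calm",
    ("excited", "high"): "agitated",
    ("agitated", "low"): "excited",
    ("agitated", "high"): "agitated",
}

# Robot behavior chosen for each inferred emotion state
BEHAVIORS = {
    "calm": "greet",
    "excited": "mirror_gesture",
    "agitated": "soothe",
}

def quantize(emotion_score: float, threshold: float = 0.5) -> str:
    """Reduce a continuous body-emotion score to a coarse level."""
    return "high" if emotion_score >= threshold else "low"

def step(state: str, emotion_score: float) -> tuple[str, str]:
    """Advance the FSM one step and pick the robot's next behavior."""
    next_state = TRANSITIONS[(state, quantize(emotion_score))]
    return next_state, BEHAVIORS[next_state]

if __name__ == "__main__":
    state = "calm"
    for score in (0.2, 0.7, 0.9, 0.3):
        state, behavior = step(state, score)
        print(state, behavior)
```

Because the FSM consults only the current state and the quantized emotion level, the robot can still produce a sensible behavior when a specific action is not recognized, which is the failure mode the scheme targets.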
Pages: 474-485 (11 pages)