Information-Driven Multirobot Behavior Adaptation to Emotional Intention in Human Robot Interaction

Cited by: 33
Authors
Chen, Luefeng [1 ,2 ]
Wu, Min [1 ,2 ]
Zhou, Mengtian [1 ,2 ]
She, Jinhua [1 ,2 ,3 ]
Dong, Fangyan [4 ]
Hirota, Kaoru [4 ]
Affiliations
[1] China Univ Geosci, Sch Automat, Wuhan 430074, Hubei, Peoples R China
[2] Hubei Key Lab Adv Control & Intelligent Automat C, Wuhan 430074, Hubei, Peoples R China
[3] Tokyo Univ Technol, Sch Engn, Tokyo 1920982, Japan
[4] Tokyo Inst Technol, Tokyo 2268502, Japan
Funding
National Natural Science Foundation of China;
Keywords
Behavior adaptation; emotional intention understanding; human-robot interaction (HRI); information-driven; reinforcement learning; FACE RECOGNITION; REINFORCEMENT; REPRESENTATION; FEATURES;
DOI
10.1109/TCDS.2017.2728003
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
To adapt robot behavior to human emotional intention, an information-driven multirobot behavior adaptation mechanism is proposed for human-robot interaction (HRI). In this mechanism, the optimal behavior policy is selected by information-driven fuzzy friend-Q learning (IDFFQ), and facial expressions together with identification information are used to understand human emotional intention. The aim is to make robots capable of understanding human emotional intention and adapting their behaviors to it, so that HRI runs smoothly. Simulation experiments are performed on a scenario of drinking at a bar. Results show that the proposed IDFFQ needs 51 fewer learning steps than fuzzy production rule-based friend-Q learning (FPRFQ), and its computational time is about one quarter of that consumed by FPRFQ. In addition, the accuracies of emotion recognition and emotional intention understanding are 80.36% and 85.71%, respectively. Preliminary application experiments were carried out on the emotional social robot system under development, and basic experimental results are reported for the bar-drinking scenario with three emotional robots and 12 volunteers.
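The abstract's core learning component builds on friend-Q learning, a multiagent Q-learning variant in which each agent assumes its partner is a "friend" that will pick the joint action maximizing their shared value. The sketch below is a minimal, hypothetical illustration of that base update on a toy problem; the paper's actual IDFFQ additionally incorporates fuzzy inference and an information-driven term that are not reproduced here, and all sizes, rewards, and hyperparameters are invented for illustration.

```python
import random

# Hypothetical toy sketch of plain friend-Q learning for two robots.
# NOT the paper's IDFFQ: the fuzzy and information-driven parts are omitted.
N_STATES, N_ACTIONS = 4, 3          # toy sizes, not from the paper
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.2

# Q[s][(a1, a2)] -> value of joint action (a1, a2) in state s
Q = [{(a1, a2): 0.0 for a1 in range(N_ACTIONS) for a2 in range(N_ACTIONS)}
     for _ in range(N_STATES)]

def greedy_joint_action(s):
    """Joint action both 'friends' agree maximizes the shared Q-value."""
    return max(Q[s], key=Q[s].get)

def friend_q_update(s, a1, a2, reward, s_next):
    """Temporal-difference update bootstrapping from the best joint action."""
    best_next = max(Q[s_next].values())
    td_target = reward + GAMMA * best_next
    Q[s][(a1, a2)] += ALPHA * (td_target - Q[s][(a1, a2)])

def step(s):
    """Epsilon-greedy joint action selection in a toy environment."""
    if random.random() < EPSILON:
        a1, a2 = random.randrange(N_ACTIONS), random.randrange(N_ACTIONS)
    else:
        a1, a2 = greedy_joint_action(s)
    # Toy reward: the robots are rewarded only for coordinating on action 0.
    reward = 1.0 if (a1, a2) == (0, 0) else 0.0
    s_next = random.randrange(N_STATES)
    friend_q_update(s, a1, a2, reward, s_next)
    return s_next

if __name__ == "__main__":
    random.seed(0)
    s = 0
    for _ in range(5000):
        s = step(s)
    print(greedy_joint_action(0))
```

Because both robots maximize one shared Q-table over joint actions, coordination emerges without explicit negotiation; the paper's contribution is in how state and exploration are shaped by emotional-intention information on top of this base scheme.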
Pages: 647-658 (12 pages)