Toward Artificial Emotional Intelligence for Cooperative Social Human-Machine Interaction

Cited by: 64
Authors
Erol, Berat A. [1 ]
Majumdar, Abhijit [1 ]
Benavidez, Patrick [1 ]
Rad, Paul [2 ]
Choo, Kim-Kwang Raymond [2 ]
Jamshidi, Mo [1 ]
Affiliations
[1] Univ Texas San Antonio, Autonomous Control Engn Labs, Dept Elect & Comp Engn, One UTSA Circle, San Antonio, TX 78249 USA
[2] Univ Texas San Antonio, Dept Informat Syst & Cyber Secur, One UTSA Circle, San Antonio, TX 78249 USA
Keywords
Assistive robotics; human-machine interactions; humanoid robot; Internet of robotic things; smart home; supervisory control; facial expression; action units; tracking; classification
DOI
10.1109/TCSS.2019.2922593
CLC Classification
TP3 [Computing technology; computer technology]
Discipline code
0812
Abstract
The aptitude to identify the emotional states of others and respond to the emotions they display is an important aspect of human social intelligence. Robots are expected to become prevalent in society, assisting humans in various tasks, so human-robot interaction (HRI) is of critical importance in the assistive robotics sector. Smart digital assistants and assistive robots often fail when a request is not well defined verbally. When the assistant fails to provide the desired service, the person may exhibit an emotional response, such as anger or frustration, through facial expressions and voice. It is therefore critical that robots understand not only the language but also the psychology of their users. This paper studies a novel affection-based perception architecture for cooperative HRI, in which the agent is expected to recognize human emotional states, thus encouraging a natural bond between the human and the robotic artifact. We propose a method to close the loop by using measured emotions to grade HRIs. This metric serves as a reward mechanism to adjust the assistant's behavior adaptively. Users' emotion levels are detected from vision and speech inputs processed by deep neural networks (NNs). Negative emotions drive a change in the assistant's behavior until the user is satisfied.
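To make the closed-loop reward idea in the abstract concrete, the following is a minimal, hypothetical Python sketch, not taken from the paper: the EmotionEstimate, interaction_reward, and AdaptiveAssistant names, the valence weights, and the verbosity behavior parameter are all illustrative assumptions about how fused face and voice emotion scores could grade an interaction and adaptively adjust the assistant's behavior.

# Illustrative sketch (not from the paper): fuse per-interaction emotion
# estimates from vision and speech classifiers into a scalar reward and use
# it to adapt the assistant's behavior. All names, weights, and the
# "verbosity" parameter below are hypothetical placeholders.

from dataclasses import dataclass

# Hypothetical valence weights per emotion class: positive emotions reinforce
# the current behavior, negative ones penalize it.
VALENCE = {"happy": 1.0, "neutral": 0.0, "frustrated": -0.7, "angry": -1.0}


@dataclass
class EmotionEstimate:
    label: str         # predicted emotion class (e.g., output of a vision or speech NN)
    confidence: float  # classifier confidence in [0, 1]


def interaction_reward(face: EmotionEstimate, voice: EmotionEstimate,
                       w_face: float = 0.6, w_voice: float = 0.4) -> float:
    """Fuse face and voice emotion estimates into a scalar reward in [-1, 1]."""
    face_score = VALENCE[face.label] * face.confidence
    voice_score = VALENCE[voice.label] * voice.confidence
    return w_face * face_score + w_voice * voice_score


class AdaptiveAssistant:
    """Toy assistant that tunes a single behavior parameter from the reward."""

    def __init__(self, learning_rate: float = 0.1):
        self.verbosity = 0.5   # hypothetical behavior parameter in [0, 1]
        self.lr = learning_rate

    def update(self, reward: float) -> None:
        # Nudge the behavior parameter in proportion to the reward; repeated
        # negative feedback (frustration, anger) keeps shifting the behavior
        # until the user's emotional response turns non-negative.
        self.verbosity = min(1.0, max(0.0, self.verbosity + self.lr * reward))


if __name__ == "__main__":
    assistant = AdaptiveAssistant()
    # Simulated classifier outputs after a failed request.
    r = interaction_reward(EmotionEstimate("frustrated", 0.8),
                           EmotionEstimate("angry", 0.6))
    assistant.update(r)
    print(f"reward = {r:.2f}, adapted verbosity = {assistant.verbosity:.2f}")

In this toy loop the reward plays the role the paper assigns to the measured emotion grade: negative scores keep perturbing the assistant's behavior until the user's emotional response is no longer negative.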
Pages: 234-246
Number of pages: 13