Toward Artificial Emotional Intelligence for Cooperative Social Human-Machine Interaction

Cited by: 55
Authors
Erol, Berat A. [1 ]
Majumdar, Abhijit [1 ]
Benavidez, Patrick [1 ]
Rad, Paul [2 ]
Choo, Kim-Kwang Raymond [2 ]
Jamshidi, Mo [1 ]
Affiliations
[1] Univ Texas San Antonio, Autonomous Control Engn Labs, Dept Elect & Comp Engn, One UTSA Circle, San Antonio, TX 78249 USA
[2] Univ Texas San Antonio, Dept Informat Syst & Cyber Secur, One UTSA Circle, San Antonio, TX 78249 USA
Source
IEEE TRANSACTIONS ON COMPUTATIONAL SOCIAL SYSTEMS | 2020, Vol. 7, No. 1
Keywords
Assistive robotics; human-machine interactions; humanoid robot; Internet of robotic things; smart home; supervisory control; FACIAL EXPRESSION; ACTION UNITS; TRACKING; CLASSIFICATION;
DOI
10.1109/TCSS.2019.2922593
Chinese Library Classification
TP3 [Computing Technology, Computer Technology];
Discipline Code
0812;
Abstract
The ability to identify the emotional states of others and respond to expressed emotions is an important aspect of human social intelligence. Robots are expected to become prevalent in society, assisting humans in various tasks, and human-robot interaction (HRI) is of critical importance in the assistive robotics sector. Smart digital assistants and assistive robots often fail when a request is not well defined verbally. When the assistant fails to provide the desired service, the person may exhibit an emotional response, such as anger or frustration, through facial expressions and voice. It is therefore critical that robots understand not only language but also human psychology. This paper studies a novel affection-based perception architecture for cooperative HRI in which the agent recognizes human emotional states, encouraging a natural bond between the human and the robotic artifact. We propose a method to close the loop by using measured emotions to grade HRIs; this metric serves as a reward mechanism to adjust the assistant's behavior adaptively. Users' emotion levels are detected from vision and speech inputs processed by deep neural networks (NNs). Negative emotions trigger adjustments in the assistant's behavior until the user is satisfied.
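The abstract describes a closed-loop scheme in which per-modality emotion estimates are fused into an interaction grade that serves as a reward for adapting the assistant's behavior. The Python sketch below is only an illustration of that idea under stated assumptions: the class and function names, the valence weights, and the single behavior parameter are hypothetical and do not come from the paper.

# Illustrative sketch (not the authors' implementation): fuse per-modality
# emotion probabilities into a scalar reward and adapt a behavior parameter
# when the reward is negative. All names and weights are hypothetical.

from dataclasses import dataclass

# Hypothetical valence weights for a few coarse emotion classes;
# positive emotions raise the reward, negative ones lower it.
VALENCE = {"happy": 1.0, "neutral": 0.2, "frustrated": -0.8, "angry": -1.0}

@dataclass
class EmotionObservation:
    """Per-modality emotion probabilities, e.g. from vision and speech NNs."""
    vision: dict   # class -> probability from a facial-expression network
    speech: dict   # class -> probability from a speech-emotion network

def grade_interaction(obs: EmotionObservation, w_vision: float = 0.6) -> float:
    """Fuse the two modalities into a scalar interaction grade in [-1, 1]."""
    def expected_valence(probs: dict) -> float:
        return sum(probs.get(c, 0.0) * v for c, v in VALENCE.items())
    return w_vision * expected_valence(obs.vision) + (1.0 - w_vision) * expected_valence(obs.speech)

class AdaptiveAssistant:
    """Adjusts one (hypothetical) behavior parameter until the reward turns positive."""
    def __init__(self, learning_rate: float = 0.1):
        self.verbosity = 0.5            # example behavior parameter, not from the paper
        self.learning_rate = learning_rate

    def update(self, reward: float) -> None:
        # Negative reward (user frustration) nudges the behavior parameter;
        # a non-negative reward leaves the current policy unchanged.
        if reward < 0:
            self.verbosity = min(1.0, self.verbosity + self.learning_rate * abs(reward))

if __name__ == "__main__":
    assistant = AdaptiveAssistant()
    obs = EmotionObservation(
        vision={"frustrated": 0.7, "neutral": 0.3},
        speech={"angry": 0.5, "neutral": 0.5},
    )
    reward = grade_interaction(obs)
    assistant.update(reward)
    print(f"reward={reward:.2f}, verbosity={assistant.verbosity:.2f}")

In the paper's terms, the deep NN outputs would supply the per-class probabilities, and a negative grade would drive the behavioral adaptation loop until the user's emotional response becomes positive.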
Pages: 234-246
Page count: 13