Facial Communicative Signals Valence Recognition in Task-Oriented Human-Robot Interaction

Cited by: 4
Authors
Lang, Christian [1 ]
Wachsmuth, Sven [2 ]
Hanheide, Marc [3 ]
Wersing, Heiko [4 ]
Affiliations
[1] Univ Bielefeld, Res Inst Cognit & Robot CoR Lab, D-33615 Bielefeld, Germany
[2] Univ Bielefeld, D-33615 Bielefeld, Germany
[3] Lincoln Univ, Sch Comp Sci, Lincoln, England
[4] Honda Res Inst Europe, Offenbach, Germany
Keywords
Facial communicative signals; Valence recognition; Head gestures; Eye gaze; Facial expressions; Object teaching; Active appearance models; EMOTION RECOGNITION; GAZE DIRECTION; AUTOMATIC-ANALYSIS; CIRCUMPLEX MODEL; VISUAL BEHAVIOR; SOCIAL-CONTEXT; EXPRESSIONS; DEFINITIONS; UNIVERSALS; PERCEPTION;
DOI
10.1007/s12369-012-0145-z
Chinese Library Classification
TP24 [Robotics]
Subject Classification Code
080202; 1405
Abstract
This paper investigates facial communicative signals (head gestures, eye gaze, and facial expressions) as nonverbal feedback in human-robot interaction. Motivated by a discussion of the literature, we suggest scenario-specific investigations due to the complex nature of these signals, and present an object-teaching scenario in which subjects teach the names of objects to a robot, which in turn is to name these objects correctly afterwards. The robot's verbal answers are used to elicit facial communicative signals from its interaction partners. We investigated the human ability to recognize this spontaneous facial feedback, as well as the performance of two automatic recognition approaches. The first is a static approach yielding baseline results, whereas the second considers the temporal dynamics and achieved classification rates comparable to human performance.
Pages: 249-262 (14 pages)
Related papers (50 total)
  • [41] Gesture spotting and recognition for human-robot interaction
    Yang, Hee-Deok
    Park, A-Yeon
    Lee, Seong-Whan
    IEEE TRANSACTIONS ON ROBOTICS, 2007, 23 (02) : 256 - 270
  • [42] Recognition in Human-Robot Interaction: The Gateway to Engagement
    Brinck, Ingar
    Balkenius, Christian
    2019 JOINT IEEE 9TH INTERNATIONAL CONFERENCE ON DEVELOPMENT AND LEARNING AND EPIGENETIC ROBOTICS (ICDL-EPIROB), 2019, : 31 - 36
  • [43] Facial Emotion Expressions in Human-Robot Interaction: A Survey
    Rawal, Niyati
    Stock-Homburg, Ruth Maria
    INTERNATIONAL JOURNAL OF SOCIAL ROBOTICS, 2022, 14 (07) : 1583 - 1604
  • [44] Active Affective Facial Analysis For Human-Robot Interaction
    Ge, Shuzhi Sam
    Samani, Hooman Aghaebrahimi
    Ong, Yin Hao Janus
    Hang, Chang Chieh
    2008 17TH IEEE INTERNATIONAL SYMPOSIUM ON ROBOT AND HUMAN INTERACTIVE COMMUNICATION, VOLS 1 AND 2, 2008, : 83 - 88
  • [45] Human-Robot Interaction based on Facial Expression Imitation
    Esfandbod, Alireza
    Rokhi, Zeynab
    Taheri, Alireza
    Alemi, Minoo
    Meghdari, Ali
    2019 7TH INTERNATIONAL CONFERENCE ON ROBOTICS AND MECHATRONICS (ICROM 2019), 2019, : 69 - 73
  • [46] Attention-Oriented Action Recognition for Real-Time Human-Robot Interaction
    Song, Ziyang
    Yin, Ziyi
    Yuan, Zejian
    Zhang, Chong
    Chi, Wanchao
    Ling, Yonggen
    Zhang, Shenghao
    2020 25TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2021, : 7087 - 7094
  • [47] Interaction Task Motion Learning for Human-Robot Interaction Control
    Lyu, Shangke
    Selvaraj, Nithish Muthuchamy
    Cheah, Chien Chern
    IEEE TRANSACTIONS ON HUMAN-MACHINE SYSTEMS, 2022, 52 (05) : 894 - 906
  • [48] Scene-Dependent Intention Recognition for Task Communication with Reduced Human-Robot Interaction
    Duncan, Kester
    Sarkar, Sudeep
    Alqasemi, Redwan
    Dubey, Rajiv
    COMPUTER VISION - ECCV 2014 WORKSHOPS, PT III, 2015, 8927 : 730 - 745
  • [49] Synthesizing facial expressions in dyadic human-robot interaction
    Sham, Abdallah Hussein
    Tikka, Pia
    Lamas, David
    Anbarjafari, Gholamreza
    SIGNAL IMAGE AND VIDEO PROCESSING, 2024, 18 (SUPPL 1) : 909 - 918
  • [50] Tracking of Facial Features to Support Human-Robot Interaction
    Pateraki, Maria
    Baltzakis, Haris
    Kondaxakis, Polychronis
    Trahanias, Panos
    ICRA: 2009 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, VOLS 1-7, 2009, : 2651 - 2656