Affective state estimation for human-robot interaction

Cited by: 163
Authors
Kulic, Dana [1 ]
Croft, Elizabeth A. [2]
Affiliations
[1] Univ Tokyo, Dept Mechanoinformat, Nakamura & Yamane Lab, Tokyo 1138656, Japan
[2] Univ British Columbia, Vancouver, BC V6T 1Z4, Canada
Keywords
affective state estimation; human-robot interaction; physiological signals
DOI
10.1109/TRO.2007.904899
Chinese Library Classification
TP24 [Robotics]
Subject Classification Codes
080202; 1405
Abstract
In order for humans and robots to interact in an effective and intuitive manner, robots must obtain information about the human affective state in response to the robot's actions. This secondary mode of interactive communication is hypothesized to permit a more natural collaboration, similar to the "body language" interaction between two cooperating humans. This paper describes the implementation and validation of a hidden Markov model (HMM) for estimating human affective state in real time, using robot motions as the stimulus. Inputs to the system are physiological signals such as heart rate, perspiration rate, and facial muscle contraction. Affective state was estimated using a two-dimensional valence-arousal representation. A robot manipulator was used to generate motions expected during human-robot interaction, and human subjects were asked to report their responses to these motions. The human physiological response was also measured. Robot motions were generated using both a nominal potential field planner and a recently reported safe motion planner that minimizes the potential collision forces along the path. The robot motions were tested with 36 subjects, and the resulting data were used to train and validate the HMM. The results of the HMM affective estimation are also compared to those of a previously implemented fuzzy inference engine.
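The abstract does not specify the model's parameters or inference procedure, but the real-time estimation it describes corresponds to standard recursive HMM filtering. Below is a minimal sketch, assuming toy transition and emission matrices, hypothetical state labels on a coarse valence-arousal grid, and physiological readings already discretized to a single observation symbol per time step; none of these values come from the paper.

```python
# A minimal sketch (not the authors' implementation) of real-time affective
# state filtering with a discrete HMM, using hand-picked toy parameters.
import numpy as np

# Hypothetical hidden states on a coarse valence-arousal grid.
STATES = ["calm", "aroused-positive", "aroused-negative"]

# Toy transition matrix A[i, j] = P(state j at t | state i at t-1).
A = np.array([
    [0.90, 0.05, 0.05],
    [0.10, 0.85, 0.05],
    [0.10, 0.05, 0.85],
])

# Toy emission matrix B[j, k] = P(observation k | state j), where each
# observation is a physiological feature vector (heart rate, skin
# conductance, facial EMG) discretized to one of three symbols.
B = np.array([
    [0.70, 0.20, 0.10],
    [0.15, 0.70, 0.15],
    [0.10, 0.20, 0.70],
])

def forward_step(belief, obs):
    """One recursive forward-filtering update: predict, then correct."""
    predicted = belief @ A            # propagate belief through the dynamics
    updated = predicted * B[:, obs]   # weight by the observation likelihood
    return updated / updated.sum()    # renormalize to a valid distribution

belief = np.full(len(STATES), 1.0 / len(STATES))  # uniform prior
for obs in [0, 0, 1, 2, 2]:          # a toy stream of discretized readings
    belief = forward_step(belief, obs)
    print(STATES[int(np.argmax(belief))], np.round(belief, 3))
```

Because each update touches only the previous belief vector and the current observation, the filter runs in constant time per sample, which is what makes this formulation suitable for the online, during-motion estimation the paper targets.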
Pages: 991-1000
Page count: 10