Bi-modal emotion recognition from expressive face and body gestures

Cited by: 220
Authors
Gunes, Hatice
Piccardi, Massimo
Affiliations
[1] Computer Vision Research Group, Faculty of Information Technology, University of Technology, Sydney (UTS), Broadway, NSW, 2007
Keywords
Bi-modal emotion recognition; facial expression; expressive body gestures; feature-level fusion; decision-level fusion
DOI
10.1016/j.jnca.2006.09.007
Chinese Library Classification (CLC)
TP3 [Computing technology, computer technology]
Discipline classification code
0812
Abstract
Psychological research findings suggest that humans rely on the combined visual channels of face and body more than on any other channel when they make judgments about human communicative behavior. However, most existing systems attempting to analyze human nonverbal behavior are mono-modal and focus only on the face. Research that aims to integrate gestures as a means of expression has only recently emerged. Accordingly, this paper presents an approach to automatic visual recognition of expressive face and upper-body gestures from video sequences, suitable for use in a vision-based affective multi-modal framework. Face and body movements are captured simultaneously using two separate cameras. For each video sequence, single expressive frames from both the face and the body are selected manually for analysis and recognition of emotions. First, individual classifiers are trained from the individual modalities. Second, we fuse facial expression and affective body gesture information at the feature level and at the decision level. In the experiments performed, emotion classification using the two modalities achieved better recognition accuracy, outperforming classification using either the facial or the bodily modality alone. (c) 2006 Elsevier Ltd. All rights reserved.
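The two fusion strategies named in the abstract can be illustrated with a minimal sketch, not the authors' implementation: feature-level fusion concatenates the face and body feature vectors before training a single classifier, while decision-level fusion trains one classifier per modality and combines their outputs. The feature dimensions, the SVM classifiers, the six-class label set, and the sum rule below are assumptions chosen purely for illustration.

```python
# Minimal sketch of feature-level vs. decision-level fusion on synthetic data.
# All numbers, the SVM choice, and the sum rule are illustrative assumptions,
# not the method described in the paper.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_samples, n_classes = 200, 6                       # assumed six emotion classes
face_feats = rng.normal(size=(n_samples, 50))       # hypothetical face feature vectors
body_feats = rng.normal(size=(n_samples, 30))       # hypothetical body-gesture feature vectors
labels = rng.integers(0, n_classes, size=n_samples)

# Feature-level fusion: concatenate both feature vectors, train one classifier.
fused_feats = np.hstack([face_feats, body_feats])
feature_level_clf = SVC(probability=True).fit(fused_feats, labels)

# Decision-level fusion: one classifier per modality, then combine
# their class-probability outputs (here with a simple sum rule).
face_clf = SVC(probability=True).fit(face_feats, labels)
body_clf = SVC(probability=True).fit(body_feats, labels)

def decision_level_predict(face_x, body_x):
    """Combine per-modality posteriors with a sum rule (one possible choice)."""
    posteriors = face_clf.predict_proba(face_x) + body_clf.predict_proba(body_x)
    return posteriors.argmax(axis=1)

print(feature_level_clf.predict(fused_feats[:5]))
print(decision_level_predict(face_feats[:5], body_feats[:5]))
```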
Pages: 1334-1345
Page count: 12