Multimodal Emotion Recognition in Response to Videos

Cited by: 474
Authors
Soleymani, Mohammad [1 ]
Pantic, Maja [2 ,3 ]
Pun, Thierry [1 ]
Affiliations
[1] Univ Geneva, Dept Comp Sci, Comp Vis & Multimedia Lab, CH-1227 Carouge, GE, Switzerland
[2] Univ London Imperial Coll Sci Technol & Med, Dept Comp, London SW7 2AZ, England
[3] Univ Twente, Fac Elect Engn Math & Comp Sci, NL-7522 NB Enschede, Netherlands
Funding
European Research Council; Swiss National Science Foundation;
Keywords
Emotion recognition; EEG; pupillary reflex; pattern classification; affective computing; PUPIL LIGHT REFLEX; CLASSIFICATION; OSCILLATIONS; SYSTEMS;
DOI
10.1109/T-AFFC.2011.37
CLC classification number
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104; 0812; 0835; 1405;
Abstract
This paper presents a user-independent emotion recognition method with the goal of recovering affective tags for videos using electroencephalogram (EEG), pupillary response, and gaze distance. We first selected 20 video clips with extrinsic emotional content from movies and online resources. EEG responses and eye-gaze data were then recorded from 24 participants while they watched the emotional video clips. Ground truth was defined from the median arousal and valence scores given to the clips in a preliminary study using an online questionnaire. Based on the participants' responses, three classes were defined for each dimension: calm, medium aroused, and activated for arousal, and unpleasant, neutral, and pleasant for valence. One of the three affective labels, for either valence or arousal, was determined by classifying the bodily responses. A leave-one-participant-out cross-validation was employed to investigate classification performance in a user-independent setting. The best classification accuracies, 68.5 percent for the three valence labels and 76.4 percent for the three arousal labels, were obtained using a modality-fusion strategy and a support vector machine. The results over a population of 24 participants demonstrate that user-independent emotion recognition can outperform individual self-reports for arousal assessment and does not underperform for valence assessment.
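The evaluation protocol described above — feature-level fusion of modalities, a support vector machine classifier, and leave-one-participant-out cross-validation over 24 participants — can be sketched as follows. This is a minimal illustration on synthetic data, not the authors' implementation: the feature dimensions, RBF kernel, and concatenation-based fusion are assumptions made for the example.

```python
# Sketch of user-independent evaluation: leave-one-participant-out
# cross-validation with an SVM over fused (concatenated) features.
# All data below are synthetic; dimensions and kernel are illustrative.
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_participants, trials_per_participant = 24, 20  # 24 subjects, 20 clips each
n = n_participants * trials_per_participant

eeg_feats = rng.normal(size=(n, 32))   # e.g., EEG band-power features (assumed dim)
gaze_feats = rng.normal(size=(n, 5))   # e.g., pupillary response, gaze distance
X = np.hstack([eeg_feats, gaze_feats]) # feature-level fusion by concatenation
y = rng.integers(0, 3, size=n)         # three classes per dimension (e.g., arousal)
groups = np.repeat(np.arange(n_participants), trials_per_participant)

logo = LeaveOneGroupOut()  # each fold holds out one participant entirely
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))

accs = []
for train_idx, test_idx in logo.split(X, y, groups):
    clf.fit(X[train_idx], y[train_idx])           # train on 23 participants
    accs.append(clf.score(X[test_idx], y[test_idx]))  # test on the held-out one

print(f"mean accuracy over {len(accs)} held-out participants: {np.mean(accs):.3f}")
```

Grouping folds by participant, rather than by random trial split, is what makes the estimate user-independent: the classifier never sees any data from the participant it is tested on.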
Pages: 211-223
Page count: 13