DEAP: A Database for Emotion Analysis Using Physiological Signals

Cited by: 3046
Authors
Koelstra, Sander [1 ]
Muhl, Christian [2 ]
Soleymani, Mohammad [3 ]
Lee, Jong-Seok [4 ]
Yazdani, Ashkan [5 ]
Ebrahimi, Touradj [5 ]
Pun, Thierry [3 ]
Nijholt, Anton [2 ]
Patras, Ioannis [1 ]
Affiliations
[1] Queen Mary Univ London, Sch Elect Engn & Comp Sci, London E1 4NS, England
[2] Univ Twente, Human Media Interact Grp, NL-7500 AE Enschede, Netherlands
[3] Univ Geneva, Dept Comp Sci, CH-1227 Geneva, Switzerland
[4] Yonsei Univ, Sch Integrated Technol, Seoul 120749, South Korea
[5] Ecole Polytech Fed Lausanne, Inst Elect Engn IEL, Multimedia Signal Proc Grp, CH-1015 Lausanne, Switzerland
Keywords
Emotion classification; EEG; physiological signals; signal processing; pattern classification; affective computing; self-assessment mannequin; audio
DOI
10.1109/T-AFFC.2011.15
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
We present a multimodal data set for the analysis of human affective states. The electroencephalogram (EEG) and peripheral physiological signals of 32 participants were recorded as each watched 40 one-minute-long excerpts of music videos. Participants rated each video in terms of the levels of arousal, valence, like/dislike, dominance, and familiarity. For 22 of the 32 participants, frontal face video was also recorded. A novel method for stimuli selection is proposed using retrieval by affective tags from the last.fm website, video highlight detection, and an online assessment tool. An extensive analysis of the participants' ratings during the experiment is presented. Correlates between the EEG signal frequencies and the participants' ratings are investigated. Methods and results are presented for single-trial classification of arousal, valence, and like/dislike ratings using the modalities of EEG, peripheral physiological signals, and multimedia content analysis. Finally, decision fusion of the classification results from different modalities is performed. The data set is made publicly available and we encourage other researchers to use it for testing their own affective state estimation methods.
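As an illustration of how the released data might be used, the sketch below loads one participant's preprocessed recording and predicts binarized arousal from EEG band-power features with a Gaussian naive Bayes classifier, loosely mirroring the single-trial classification described in the abstract. It assumes the publicly distributed "data_preprocessed_python" files (one pickled dict per participant, with a 40 x 40 x 8064 'data' array sampled at 128 Hz and a 40 x 4 'labels' array ordered valence, arousal, dominance, liking); the file name s01.dat, the frequency bands, and the classifier choice are illustrative assumptions, not the authors' exact pipeline.

import pickle
import numpy as np
from scipy.signal import welch
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

FS = 128                      # sampling rate of the preprocessed signals (Hz)
EEG_CHANNELS = 32             # the first 32 of the 40 recorded channels are EEG
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

def band_power_features(trial):
    # Log band power per EEG channel and band for a single 40 x 8064 trial.
    freqs, psd = welch(trial[:EEG_CHANNELS], fs=FS, nperseg=2 * FS)
    feats = []
    for lo, hi in BANDS.values():
        idx = (freqs >= lo) & (freqs < hi)
        feats.append(np.log(psd[:, idx].mean(axis=1) + 1e-12))
    return np.concatenate(feats)        # 32 channels x 4 bands = 128 features

# Hypothetical path: one participant file from data_preprocessed_python.
with open("s01.dat", "rb") as f:
    subj = pickle.load(f, encoding="latin1")   # dict with 'data' and 'labels'

X = np.array([band_power_features(trial) for trial in subj["data"]])
y = (subj["labels"][:, 1] > 5).astype(int)     # binarize arousal at the 1-9 scale midpoint

scores = cross_val_score(GaussianNB(), X, y, cv=5)
print("mean single-trial arousal accuracy: %.2f" % scores.mean())

Per-modality outputs of this kind could then be combined by decision fusion, for example by averaging posterior probabilities across the EEG, peripheral, and multimedia classifiers.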
Pages: 18-31
Number of pages: 14