Leveraging spatial-temporal convolutional features for EEG-based emotion recognition

Cited by: 41
Authors
An, Yi [1 ]
Xu, Ning [1 ]
Qu, Zhen [1 ]
Affiliations
[1] Tibet Univ, Sch Informat Sci, Lhasa, Peoples R China
Keywords
EEG; Emotion recognition; DCNN; Attention; ConvLSTM; FEATURE-SELECTION; CLASSIFICATION;
DOI
10.1016/j.bspc.2021.102743
Chinese Library Classification
R318 [Biomedical Engineering];
Discipline Classification Code
0831 ;
Abstract
The electroencephalogram (EEG) signal is a practical medium for realizing a brain-computer interface (BCI) system due to its zero clinical risk and portable acquisition devices. Deep learning has achieved great success in solving various vision-based research problems, including affective computing. In the present paper, a novel framework for EEG-based emotion recognition is therefore proposed. The framework consists of two modules. The first module is a deep convolutional neural network (DCNN) architecture, which represents the inter-channel correlation among physically adjacent EEG signals by converting the chain-like EEG sequence into 2D frame sequences. The second module is a ConvLSTM, which represents the sequential information of the EEG data samples. The DCNN and ConvLSTM features are then concatenated and weighted by an attention mechanism for final emotion recognition. Extensive experiments conducted on the DEAP database demonstrate that: (1) the proposed framework effectively improves emotion classification accuracy, reaching up to 87.69% on the arousal dimension, which is higher than most state-of-the-art methods; (2) the valence dimension also achieves comparable recognition performance with an accuracy of 87.84%, which surpasses most state-of-the-art methods.
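The conversion of a chain-like EEG sequence into 2D frame sequences can be sketched as follows. This is a minimal illustration, not the paper's implementation: the electrode-to-grid mapping below covers only a hypothetical subset of 10-20 positions on a 9x9 grid, and the paper's exact layout is not given in this record.

```python
import numpy as np

# Hypothetical placement of a few 10-20 electrodes on a 9x9 grid;
# illustrative only -- the paper's exact mapping is not stated here.
CHANNEL_POS = {
    "Fp1": (0, 3), "Fp2": (0, 5),
    "F3":  (2, 2), "Fz":  (2, 4), "F4": (2, 6),
    "C3":  (4, 2), "Cz":  (4, 4), "C4": (4, 6),
    "P3":  (6, 2), "Pz":  (6, 4), "P4": (6, 6),
    "O1":  (8, 3), "O2":  (8, 5),
}

def eeg_to_frames(signals, channel_names, grid=(9, 9)):
    """Convert chain-like EEG (channels x time) into a sequence of
    2D frames (time x H x W) that preserves scalp topology, so that
    2D convolutions see physically adjacent channels as neighbors."""
    n_ch, n_t = signals.shape
    frames = np.zeros((n_t, *grid), dtype=signals.dtype)
    for ch, name in enumerate(channel_names):
        r, c = CHANNEL_POS[name]
        # place each channel's time series at its scalp position;
        # unused grid cells stay zero
        frames[:, r, c] = signals[ch]
    return frames

names = list(CHANNEL_POS)
x = np.random.randn(len(names), 128)  # 13 channels, 128 samples (1 s at 128 Hz)
frames = eeg_to_frames(x, names)
print(frames.shape)  # (128, 9, 9): one sparse 2D frame per time step
```

The resulting frame sequence can then be fed to the DCNN (per-frame spatial features) and the ConvLSTM (temporal dynamics across frames), as the abstract describes.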
Pages: 8
Related Papers
34 records
[1]  
[Anonymous], IEEE T AFFECT COMPUT
[2]  
[Anonymous], IEEE Transactions on Affective Computing
[3]   Improving BCI-based emotion recognition by combining EEG feature selection and kernel classifiers [J].
Atkinson, John ;
Campos, Daniel .
EXPERT SYSTEMS WITH APPLICATIONS, 2016, 47 :35-41
[4]  
Du X., IEEE T AFFECT COMPUT
[5]  
Guo YL, 2017, IEEE IMAGE PROC, P3295, DOI 10.1109/ICIP.2017.8296892
[6]   Discriminative Spatiotemporal Local Binary Pattern with Revisited Integral Projection for Spontaneous Facial Micro-Expression Recognition [J].
Huang, Xiaohua ;
Wang, Su-Jing ;
Liu, Xin ;
Zhao, Guoying ;
Feng, Xiaoyi ;
Pietikainen, Matti .
IEEE TRANSACTIONS ON AFFECTIVE COMPUTING, 2019, 10 (01) :32-47
[7]   Computational modelling of visual attention [J].
Itti, L ;
Koch, C .
NATURE REVIEWS NEUROSCIENCE, 2001, 2 (03) :194-203
[8]   A feature selection method based on a neighborhood approach for contending with functional and anatomical variability in fMRI group analysis of cognitive states [J].
Juarez-Castillo, Efren ;
Gabriel Acosta-Mesa, Hector ;
Fernandez-Ruiz, Juan ;
Cruz-Ramirez, Nicandro .
INTELLIGENT DATA ANALYSIS, 2017, 21 (03) :661-677
[9]   Combining Modality Specific Deep Neural Networks for Emotion Recognition in Video [J].
Kahou, Samira Ebrahimi ;
Pal, Christopher ;
Bouthillier, Xavier ;
Froumenty, Pierre ;
Gulcehre, Caglar ;
Memisevic, Roland ;
Vincent, Pascal ;
Courville, Aaron ;
Bengio, Yoshua .
ICMI'13: PROCEEDINGS OF THE 2013 ACM INTERNATIONAL CONFERENCE ON MULTIMODAL INTERACTION, 2013, :543-550
[10]   DEAP: A Database for Emotion Analysis Using Physiological Signals [J].
Koelstra, Sander ;
Muhl, Christian ;
Soleymani, Mohammad ;
Lee, Jong-Seok ;
Yazdani, Ashkan ;
Ebrahimi, Touradj ;
Pun, Thierry ;
Nijholt, Anton ;
Patras, Ioannis .
IEEE TRANSACTIONS ON AFFECTIVE COMPUTING, 2012, 3 (01) :18-31