Emotion recognition with convolutional neural network and EEG-based EFDMs

Cited by: 135
Authors
Wang, Fei [1 ]
Wu, Shichao [1 ]
Zhang, Weiwei [1 ]
Xu, Zongfeng [2 ]
Zhang, Yahui [2 ]
Wu, Chengdong [1 ]
Coleman, Sonya [3 ]
Affiliations
[1] Northeastern Univ, Fac Robot Sci & Engn, Shenyang 110169, Peoples R China
[2] Northeastern Univ, Coll Informat Sci & Engn, Shenyang 110819, Peoples R China
[3] Ulster Univ, Intelligent Syst Res Ctr, Newtownabbey, Londonderry, Northern Ireland
Funding
National Natural Science Foundation of China;
Keywords
Emotion recognition; Electroencephalogram; Convolutional neural network; Electrode-frequency distribution maps; Gradient-weighted class activation mapping; CLASSIFICATION; DEPRESSION; ASYMMETRY; IMPLICIT; MACHINE; ENTROPY; MUSIC;
DOI
10.1016/j.neuropsychologia.2020.107506
Chinese Library Classification (CLC)
B84 [Psychology]; C [Social Sciences, General]; Q98 [Anthropology];
Subject classification codes
03 ; 0303 ; 030303 ; 04 ; 0402 ;
Abstract
Electroencephalogram (EEG), as a direct response to brain activity, can be used to detect mental states and physical conditions. Among existing EEG-based emotion recognition studies, traditional recognition methods still suffer from complicated feature extraction and low recognition rates, owing to the non-linear, non-stationary nature of EEG signals and to individual differences. Thus, this paper first proposes a novel concept of electrode-frequency distribution maps (EFDMs) based on the short-time Fourier transform (STFT). A residual-block-based deep convolutional neural network (CNN) is then proposed for automatic feature extraction and emotion classification from EFDMs. To address the small number of available EEG samples and the individual differences in emotional responses, which make it difficult to construct a universal model, this paper proposes a cross-dataset emotion recognition method based on deep model transfer learning. Experiments were carried out on two publicly available datasets. The proposed method achieved an average classification accuracy of 90.59% on SEED using short segments of EEG data, which is 4.51% higher than the baseline method. The pre-trained model was then applied to DEAP through deep model transfer learning with a few samples, resulting in an average accuracy of 82.84%. Finally, this paper adopts gradient-weighted class activation mapping (Grad-CAM) to visualize the features the CNN has learned from EFDMs during training, and concludes that the high-frequency bands are more favorable for emotion recognition.
Pages: 11
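
As an illustrative sketch of the EFDM construction described in the abstract: one plausible reading is that each EEG segment is transformed channel by channel with the STFT and the spectral magnitudes are collapsed into an electrode-by-frequency map. The snippet below follows that reading; the function name compute_efdm, the 200 Hz sampling rate, the 62-channel layout, the window length, the time-averaging of STFT magnitudes, and the min-max normalization are assumptions for illustration, not details taken from the paper.

# Minimal sketch: build an electrode-frequency distribution map (EFDM) from a
# multichannel EEG segment via the short-time Fourier transform (STFT).
# Assumptions (not from the paper): the EFDM is approximated as the time-averaged
# STFT magnitude per electrode and frequency bin, min-max normalized; the sampling
# rate, window length, and 0-50 Hz range are illustrative values only.
import numpy as np
from scipy.signal import stft

def compute_efdm(eeg_segment, fs=200, nperseg=128, fmax=50.0):
    """eeg_segment: array of shape (n_electrodes, n_samples)."""
    rows = []
    for channel in eeg_segment:
        freqs, _, Zxx = stft(channel, fs=fs, nperseg=nperseg)
        keep = freqs <= fmax                         # keep bins up to fmax
        rows.append(np.abs(Zxx[keep]).mean(axis=1))  # average magnitude over time
    efdm = np.stack(rows)                            # (n_electrodes, n_freq_bins)
    return (efdm - efdm.min()) / (efdm.max() - efdm.min() + 1e-8)

# Example: a 4 s segment from 62 electrodes sampled at 200 Hz (SEED-like shape)
segment = np.random.randn(62, 4 * 200)
print(compute_efdm(segment).shape)                   # (62, number of kept bins)

Each such map could then be treated as a single-channel image and passed to the residual CNN described in the abstract for feature extraction and classification.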