Interpretable Cross-Subject EEG-Based Emotion Recognition Using Channel-Wise Features

Cited by: 20
Authors
Jin, Longbin [1 ]
Kim, Eun Yi [1 ]
Affiliations
[1] Konkuk Univ, Comp Sci & Engn, Seoul 05029, South Korea
Keywords
EEG; cross-subject; emotion recognition; user independent model; channel-wise feature;
DOI
10.3390/s20236719
CLC Classification
O65 [Analytical Chemistry];
Subject Classification Codes
070302; 081704;
Abstract
Electroencephalogram (EEG)-based emotion recognition is receiving significant attention in research on brain-computer interfaces (BCI) and health care. To recognize cross-subject emotion from EEG data accurately, a technique is needed that can find an effective representation robust to the subject-specific variability inherent in EEG data collection. In this paper, a new method to predict cross-subject emotion using time-series analysis and spatial correlation is proposed. To represent the spatial connectivity between brain regions, a channel-wise feature is proposed, which can effectively capture the correlation among all channels. The channel-wise feature is defined as a symmetric matrix whose elements are the Pearson correlation coefficients between pairs of channels, enabling complementary handling of subject-specific variability. The channel-wise features are then fed to a two-layer stacked long short-term memory (LSTM) network, which extracts temporal features and learns an emotion model. Extensive experiments on two publicly available datasets, the Dataset for Emotion Analysis using Physiological Signals (DEAP) and the SJTU (Shanghai Jiao Tong University) Emotion EEG Dataset (SEED), demonstrate the effectiveness of the combined use of channel-wise features and LSTM. The method achieves state-of-the-art classification rates of 98.93% and 99.10% for two-class classification of valence and arousal in DEAP, respectively, and an accuracy of 99.63% for three-class classification in SEED.
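The channel-wise feature described in the abstract — a symmetric matrix of Pearson correlation coefficients over all channel pairs within a time window — can be sketched as follows. This is a minimal illustration, not the authors' released code; the window shape `(n_channels, n_samples)` and the DEAP-style 32-channel, 128 Hz example are assumptions for demonstration.

```python
import numpy as np

def channel_wise_feature(window: np.ndarray) -> np.ndarray:
    """Pearson correlation between every pair of EEG channels.

    window: array of shape (n_channels, n_samples), one time window.
    Returns a symmetric (n_channels, n_channels) matrix whose (i, j)
    entry is the Pearson correlation coefficient of channels i and j.
    """
    # np.corrcoef treats each row as one variable, which matches the
    # (channels, samples) layout used here.
    return np.corrcoef(window)

# Illustrative example: 32 channels (DEAP montage), 1 s at 128 Hz.
rng = np.random.default_rng(0)
window = rng.standard_normal((32, 128))
feature = channel_wise_feature(window)

assert feature.shape == (32, 32)
assert np.allclose(feature, feature.T)     # symmetric by construction
assert np.allclose(np.diag(feature), 1.0)  # self-correlation is 1
```

A sequence of such matrices, one per window, would then form the time series fed to the stacked LSTM described in the abstract.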
Pages: 1-18 (18 pages)