An Attention-Based Multi-Domain Bi-Hemisphere Discrepancy Feature Fusion Model for EEG Emotion Recognition

Cited by: 6
Authors
Gong, Linlin [1 ]
Chen, Wanzhong [1 ]
Zhang, Dingguo [2 ]
Affiliations
[1] Jilin Univ, Coll Commun Engn, Jilin 130025, Peoples R China
[2] Univ Bath, Dept Elect & Elect Engn, Bath BA2 7AY, England
Keywords
Electroencephalography; Feature extraction; Brain modeling; Emotion recognition; Convolution; Electrodes; Convolutional neural networks; Electroencephalogram (EEG)-based emotion recognition; brain hemisphere asymmetry; pseudo-3D residual convolutional neural network; attention mechanism; feature fusion; NETWORK; SIGNALS
DOI
10.1109/JBHI.2024.3418010
CLC Classification Number
TP [Automation and computer technology]
Discipline Classification Code
0812
Abstract
Electroencephalogram (EEG)-based emotion recognition has become a research hotspot in the field of brain-computer interfaces. Previous emotion recognition methods have overlooked the fusion of multi-domain emotion-specific information and have suffered from insufficient interpretability. In this paper, we propose a novel EEG emotion recognition model that combines brain hemisphere asymmetry with the spatial, spectral, and temporal multi-domain properties of EEG signals, aiming to improve emotion recognition performance. Based on the 10-20 standard system, a global spatial projection matrix (GSPM) and a bi-hemisphere discrepancy projection matrix (BDPM) are constructed. A dual-stream spatial-spectral-temporal convolutional neural network is designed to extract deep features from the two matrix paradigms. Finally, a transformer-based fusion module is used to learn the dependencies of the fused features and to retain discriminative information. We conducted extensive experiments on the SEED, SEED-IV, and DEAP public datasets, achieving excellent average results of 98.33/2.46%, 92.15/5.13%, 97.60/1.68% (valence), and 97.48/1.42% (arousal), respectively. Visualization analysis supports the interpretability of the model, and ablation experiments validate the effectiveness of multi-domain and bi-hemisphere discrepancy information fusion.
Pages: 5890-5903
Number of pages: 14
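As a rough illustration of the architecture outlined in the abstract, the following PyTorch sketch pairs two convolutional streams, one fed a global spatial projection of the EEG and the other a bi-hemisphere discrepancy projection (e.g. left-hemisphere channels minus their mirrored right-hemisphere counterparts mapped onto an electrode grid), and fuses their features with a small Transformer encoder. All module names, layer sizes, and the plain 3D convolutions used here are illustrative assumptions; the paper itself uses a pseudo-3D residual CNN with attention, whose exact configuration is not reproduced here.

# Minimal dual-stream + Transformer-fusion sketch (assumed configuration, not the authors' code).
import torch
import torch.nn as nn


class ConvStream(nn.Module):
    """Convolutional stream over a (bands, grid_h, grid_w, time) EEG tensor."""
    def __init__(self, in_channels: int, feat_dim: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv3d(in_channels, 32, kernel_size=3, padding=1),
            nn.BatchNorm3d(32),
            nn.ReLU(inplace=True),
            nn.Conv3d(32, feat_dim, kernel_size=3, padding=1),
            nn.BatchNorm3d(feat_dim),
            nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool3d(1),   # collapse spatial, spectral, and temporal axes
        )

    def forward(self, x):              # x: (batch, bands, grid_h, grid_w, time)
        return self.net(x).flatten(1)  # -> (batch, feat_dim)


class DualStreamFusionNet(nn.Module):
    """Global stream + bi-hemisphere-discrepancy stream, fused by a Transformer encoder."""
    def __init__(self, in_channels: int = 5, feat_dim: int = 64, num_classes: int = 3):
        super().__init__()
        self.global_stream = ConvStream(in_channels, feat_dim)        # GSPM-style input
        self.discrepancy_stream = ConvStream(in_channels, feat_dim)   # BDPM-style input
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=feat_dim, nhead=4, dim_feedforward=128, batch_first=True
        )
        self.fusion = nn.TransformerEncoder(encoder_layer, num_layers=2)
        self.classifier = nn.Linear(feat_dim, num_classes)

    def forward(self, x_global, x_discrepancy):
        f_g = self.global_stream(x_global)            # (batch, feat_dim)
        f_d = self.discrepancy_stream(x_discrepancy)  # (batch, feat_dim)
        tokens = torch.stack([f_g, f_d], dim=1)       # (batch, 2, feat_dim)
        fused = self.fusion(tokens).mean(dim=1)       # attend across the two streams
        return self.classifier(fused)


if __name__ == "__main__":
    model = DualStreamFusionNet()
    # Toy inputs: 5 frequency bands, a 9x9 electrode grid, 8 temporal segments.
    xg = torch.randn(2, 5, 9, 9, 8)
    xd = torch.randn(2, 5, 9, 9, 8)
    print(model(xg, xd).shape)  # torch.Size([2, 3])

Treating each stream's pooled feature vector as a token lets the Transformer's self-attention weigh the global and discrepancy representations against each other, which is one simple way to realize the attention-based fusion the abstract describes.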