An Attention-Based Multi-Domain Bi-Hemisphere Discrepancy Feature Fusion Model for EEG Emotion Recognition

Cited by: 2
Authors
Gong, Linlin [1]
Chen, Wanzhong [1]
Zhang, Dingguo [2]
Affiliations
[1] Jilin Univ, Coll Commun Engn, Jilin 130025, Peoples R China
[2] Univ Bath, Dept Elect & Elect Engn, Bath BA2 7AY, England
Keywords
Electroencephalography; Feature extraction; Brain modeling; Emotion recognition; Convolution; Electrodes; Convolutional neural networks; Electroencephalogram (EEG)-based emotion recognition; brain hemisphere asymmetry; pseudo-3D residual convolutional neural network; attention mechanism; feature fusion; NETWORK; SIGNALS
DOI
10.1109/JBHI.2024.3418010
CLC Classification Number
TP [Automation Technology, Computer Technology]
Discipline Classification Code
0812
Abstract
Electroencephalogram (EEG)-based emotion recognition has become a research hotspot in the field of brain-computer interfaces. Previous emotion recognition methods have overlooked the fusion of multi-domain emotion-specific information and have faced the challenge of insufficient interpretability. In this paper, we propose a novel EEG emotion recognition model that combines brain hemisphere asymmetry with the spatial, spectral, and temporal multi-domain properties of EEG signals, aiming to improve emotion recognition performance. Based on the 10-20 standard system, a global spatial projection matrix (GSPM) and a bi-hemisphere discrepancy projection matrix (BDPM) are constructed. A dual-stream spatial-spectral-temporal convolutional neural network is designed to extract deep features from the two matrix paradigms. Finally, a transformer-based fusion module is used to learn the dependencies among the fused features and to retain discriminative information. We conducted extensive experiments on the public SEED, SEED-IV, and DEAP datasets, achieving excellent average results of 98.33/2.46%, 92.15/5.13%, 97.60/1.68% (valence), and 97.48/1.42% (arousal), respectively. Visualization analysis supports the interpretability of the model, and ablation experiments validate the effectiveness of multi-domain and bi-hemisphere discrepancy information fusion.
Pages: 5890-5903
Page count: 14
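The abstract outlines a pipeline of two projection paradigms (GSPM and BDPM) feeding a dual-stream spatial-spectral-temporal CNN, followed by a transformer-based fusion module. The snippet below is a minimal, hypothetical PyTorch sketch of that kind of architecture, not the authors' implementation: the layer sizes, the 9x9 electrode map, the 5 frequency bands, and all module names are illustrative assumptions, and both streams share the same toy encoder for brevity.

```python
# Hypothetical sketch of a dual-stream CNN + transformer fusion pipeline,
# loosely following the abstract. Shapes and layer sizes are assumptions.
import torch
import torch.nn as nn


class StreamEncoder(nn.Module):
    """One convolutional stream over a (bands, height, width, time) EEG projection."""

    def __init__(self, in_bands: int, d_model: int = 64):
        super().__init__()
        # 3D convolutions over the two spatial axes and the temporal axis.
        self.conv = nn.Sequential(
            nn.Conv3d(in_bands, 32, kernel_size=3, padding=1),
            nn.BatchNorm3d(32),
            nn.ReLU(),
            nn.Conv3d(32, d_model, kernel_size=3, padding=1),
            nn.BatchNorm3d(d_model),
            nn.ReLU(),
            nn.AdaptiveAvgPool3d(1),  # collapse spatial and temporal axes
        )
        self.proj = nn.Linear(d_model, d_model)

    def forward(self, x):                 # x: (batch, bands, H, W, T)
        h = self.conv(x).flatten(1)       # (batch, d_model)
        return self.proj(h)


class DualStreamEmotionNet(nn.Module):
    """Global-projection stream + bi-hemisphere-discrepancy stream,
    fused with a small transformer encoder (illustrative only)."""

    def __init__(self, in_bands: int = 5, d_model: int = 64, n_classes: int = 3):
        super().__init__()
        self.global_stream = StreamEncoder(in_bands, d_model)       # GSPM input
        self.discrepancy_stream = StreamEncoder(in_bands, d_model)  # BDPM input
        fusion_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=4, batch_first=True)
        self.fusion = nn.TransformerEncoder(fusion_layer, num_layers=2)
        self.classifier = nn.Linear(d_model, n_classes)

    def forward(self, x_gspm, x_bdpm):
        # Treat the two stream embeddings as a two-token sequence and let
        # self-attention model their dependence before classification.
        tokens = torch.stack(
            [self.global_stream(x_gspm), self.discrepancy_stream(x_bdpm)], dim=1)
        fused = self.fusion(tokens).mean(dim=1)
        return self.classifier(fused)


# Toy forward pass: 8 samples, 5 frequency bands, a 9x9 electrode map, 20 time steps.
model = DualStreamEmotionNet()
logits = model(torch.randn(8, 5, 9, 9, 20), torch.randn(8, 5, 9, 9, 20))
print(logits.shape)  # torch.Size([8, 3])
```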
Related Papers (50 in total)
  • [31] Multi-Domain Feature Fusion for Emotion Classification Using DEAP Dataset
    Khateeb, Muhammad
    Anwar, Syed Muhammad
    Alnowami, Majdi
    IEEE ACCESS, 2021, 9 : 12134 - 12142
  • [33] Attention-Based Multi-view Feature Fusion for Cross-Domain Recommendation
    Dai, Feifei
    Gu, Xiaoyan
    Wang, Zhuo
    Li, Bo
    Qian, Mingda
    Wang, Weiping
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2021, PT I, 2021, 12891 : 204 - 216
  • [34] An Attention-based Ensemble Model for Emotion Recognition in Conversation
    Farooq, Misbah
    De Silva, Varuna
    Shi, Xiyu
    2024 14TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION SYSTEMS, ICPRS, 2024,
  • [35] Attention-based interactive multi-level feature fusion for named entity recognition
    Xu, Yiwu
    Chen, Yun
    SCIENTIFIC REPORTS, 2025, 15 (01):
  • [36] AMFF: A new attention-based multi-feature fusion method for intention recognition
    Liu, Cong
    Xu, Xiaolong
    KNOWLEDGE-BASED SYSTEMS, 2021, 233
  • [37] EEG-based Emotion Recognition with Feature Fusion Networks
    Gao, Qiang
    Yang, Yi
    Kang, Qiaoju
    Tian, Zekun
    Song, Yu
    INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2022, 13 (02) : 421 - 429
  • [39] Emotion Recognition in EEG Based on Multilevel Multidomain Feature Fusion
    Li, Zhao Long
    Cao, Hui
    Zhang, Ji Sai
    IEEE ACCESS, 2024, 12 : 87237 - 87247
  • [40] Attention-Based CNN Fusion Model for Emotion Recognition During Walking Using Discrete Wavelet Transform on EEG and Inertial Signals
    Zhao, Yan
    Guo, Ming
    Chen, Xiangyong
    Sun, Jianqiang
    Qiu, Jianlong
    BIG DATA MINING AND ANALYTICS, 2024, 7 (01): : 188 - 204