An Attention-Based Multi-Domain Bi-Hemisphere Discrepancy Feature Fusion Model for EEG Emotion Recognition

Cited by: 2
Authors
Gong, Linlin [1 ]
Chen, Wanzhong [1 ]
Zhang, Dingguo [2 ]
Affiliations
[1] Jilin Univ, Coll Commun Engn, Jilin 130025, Peoples R China
[2] Univ Bath, Dept Elect & Elect Engn, Bath BA2 7AY, England
Keywords
Electroencephalography; Feature extraction; Brain modeling; Emotion recognition; Convolution; Electrodes; Convolutional neural networks; Electroencephalogram (EEG)-based emotion recognition; brain hemisphere asymmetry; pseudo-3D residual convolutional neural network; attention mechanism; feature fusion; NETWORK; SIGNALS;
DOI
10.1109/JBHI.2024.3418010
Chinese Library Classification (CLC)
TP [automation and computer technology];
Discipline Code
0812;
Abstract
Electroencephalogram (EEG)-based emotion recognition has become a research hotspot in the field of brain-computer interfaces. Previous emotion recognition methods have overlooked the fusion of multi-domain emotion-specific information and have faced the challenge of insufficient interpretability. In this paper, we propose a novel EEG emotion recognition model that combines brain hemisphere asymmetry with the spatial, spectral, and temporal multi-domain properties of EEG signals, aiming to improve emotion recognition performance. Based on the 10-20 standard system, a global spatial projection matrix (GSPM) and a bi-hemisphere discrepancy projection matrix (BDPM) are constructed. A dual-stream spatial-spectral-temporal convolutional neural network is designed to extract deep features from the two matrix paradigms. Finally, a transformer-based fusion module is used to learn the dependencies among the fused features and to retain discriminative information. We conducted extensive experiments on the SEED, SEED-IV, and DEAP public datasets, achieving excellent average results of 98.33/2.46%, 92.15/5.13%, 97.60/1.68% (valence), and 97.48/1.42% (arousal), respectively. Visualization analysis supports the interpretability of the model, and ablation experiments validate the effectiveness of fusing multi-domain and bi-hemisphere discrepancy information.
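At a high level, the abstract describes two parallel spatial-spectral-temporal streams (one fed by the GSPM paradigm, one by the BDPM paradigm) whose outputs are fused by a transformer encoder before classification. The PyTorch sketch below is only a hypothetical illustration of that pipeline: the StreamCNN/DualStreamFusion names, layer sizes, and the 5-band 8x9 electrode-grid input layout are assumptions for demonstration and do not reproduce the authors' implementation.

# Minimal, hypothetical sketch of the dual-stream pipeline from the abstract.
# All module names, layer sizes, and input shapes are illustrative assumptions.
import torch
import torch.nn as nn


class StreamCNN(nn.Module):
    """One spatial-spectral-temporal stream: a 3D convolution over
    (frequency bands, electrode-grid rows, electrode-grid cols) per time step."""
    def __init__(self, in_bands=5, feat_dim=64):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv3d(1, 16, kernel_size=(in_bands, 3, 3), padding=(0, 1, 1)),
            nn.ReLU(),
            nn.AdaptiveAvgPool3d((1, 4, 4)),
        )
        self.proj = nn.Linear(16 * 4 * 4, feat_dim)

    def forward(self, x):                      # x: (batch, time, bands, H, W)
        b, t = x.shape[:2]
        x = x.flatten(0, 1).unsqueeze(1)       # (batch*time, 1, bands, H, W)
        x = self.conv(x).flatten(1)            # (batch*time, 16*4*4)
        return self.proj(x).view(b, t, -1)     # (batch, time, feat_dim)


class DualStreamFusion(nn.Module):
    """Global-projection stream + bi-hemisphere-discrepancy stream,
    fused with a small transformer encoder and then classified."""
    def __init__(self, feat_dim=64, n_classes=3):
        super().__init__()
        self.global_stream = StreamCNN(feat_dim=feat_dim)
        self.discrepancy_stream = StreamCNN(feat_dim=feat_dim)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=feat_dim, nhead=4, batch_first=True)
        self.fusion = nn.TransformerEncoder(encoder_layer, num_layers=2)
        self.classifier = nn.Linear(feat_dim, n_classes)

    def forward(self, gspm, bdpm):             # both: (batch, time, bands, H, W)
        g = self.global_stream(gspm)           # (batch, time, feat_dim)
        d = self.discrepancy_stream(bdpm)      # (batch, time, feat_dim)
        fused = self.fusion(torch.cat([g, d], dim=1))  # attend across both streams
        return self.classifier(fused.mean(dim=1))      # average-pool fused tokens


if __name__ == "__main__":
    # Toy example: 2 samples, 6 time steps, 5 bands, 8x9 electrode grid.
    gspm = torch.randn(2, 6, 5, 8, 9)
    bdpm = torch.randn(2, 6, 5, 8, 9)
    print(DualStreamFusion()(gspm, bdpm).shape)  # torch.Size([2, 3])

The toy dimensions correspond to an assumed mapping of electrodes onto a 2D grid with five frequency bands per segment; the actual projection matrices (GSPM/BDPM) and network depth in the paper differ from this sketch.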
Pages: 5890-5903
Page count: 14