Multisource Associate Domain Adaptation for Cross-Subject and Cross-Session EEG Emotion Recognition

Cited by: 35
Authors
She, Qingshan [1 ]
Zhang, Chenqi [2 ]
Fang, Feng [3 ]
Ma, Yuliang [1 ]
Zhang, Yingchun [3 ]
Affiliations
[1] Hangzhou Dianzi Univ, Sch Automat, Hangzhou 310018, Zhejiang, Peoples R China
[2] Hangzhou Dianzi Univ, HDU ITMO Joint Inst, Hangzhou 310018, Zhejiang, Peoples R China
[3] Univ Houston, Dept Biomed Engn, Houston, TX 77204 USA
Funding
National Natural Science Foundation of China;
Keywords
Feature extraction; Emotion recognition; Electroencephalography; Brain modeling; Adaptation models; Data models; Data mining; Domain adaptation (DA); electroencephalogram (EEG); emotion recognition; transfer learning; DIFFERENTIAL ENTROPY FEATURE;
DOI
10.1109/TIM.2023.3277985
CLC Classification
TM [Electrical Technology]; TN [Electronic and Communication Technology];
Subject Classification Codes
0808; 0809;
Abstract
Emotion recognition is important in brain-computer interface (BCI) applications, and building an emotion recognition model that is robust across subjects and sessions is critical for emotion-based BCI systems. Electroencephalography (EEG) is a widely used tool for recognizing different emotional states. However, EEG signals have small amplitude, low signal-to-noise ratio, and nonstationary properties, resulting in large differences across subjects. To solve these problems, this article proposes a new emotion recognition method based on a multisource associate domain adaptation (DA) network that considers both domain-invariant and domain-specific features. First, separate branches were constructed for multiple source domains, assuming that EEG data from different domains share the same low-level features. Second, domain-specific features were extracted using one-to-one associate DA. Then, weights for the individual sources were obtained according to their distribution distance, and the outputs of the corresponding source classifiers were combined using these weights. Finally, EEG emotion recognition experiments were conducted on the SEED, DEAP, and SEED-IV datasets. In the cross-subject experiments, the average accuracies were 86.16% on SEED, 65.59% on DEAP, and 59.29% on SEED-IV. In the cross-session experiments, the accuracies on SEED and SEED-IV were 91.10% and 66.68%, respectively. The proposed method achieves better classification results than state-of-the-art DA methods.
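The architecture described in the abstract (a shared low-level extractor, one domain-specific branch and classifier per source, and a fusion of the source classifiers weighted by each source's distribution distance to the target) can be illustrated with a minimal PyTorch-style sketch. This is not the authors' implementation: the module names, the 310-dimensional differential-entropy input (62 channels x 5 bands, as in SEED), and the linear-kernel MMD used as a stand-in distribution distance are assumptions, and the one-to-one associate adaptation loss used during training is omitted.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def mmd_linear(f_src, f_tgt):
    """Linear-kernel MMD between source and target feature batches
    (a simple stand-in for the distribution distance)."""
    return (f_src.mean(dim=0) - f_tgt.mean(dim=0)).pow(2).sum()

class MultiSourceDANet(nn.Module):
    """Shared low-level extractor plus one domain-specific branch and
    classifier per source domain; target predictions are fused with
    weights derived from each source's distance to the target."""
    def __init__(self, n_sources, in_dim=310, hid_dim=128, n_classes=3):
        super().__init__()
        self.shared = nn.Sequential(nn.Linear(in_dim, hid_dim), nn.ReLU())
        self.specific = nn.ModuleList(
            [nn.Sequential(nn.Linear(hid_dim, 64), nn.ReLU()) for _ in range(n_sources)])
        self.classifiers = nn.ModuleList(
            [nn.Linear(64, n_classes) for _ in range(n_sources)])

    def forward(self, x_target, x_sources):
        """x_sources is a list with one batch per source domain."""
        h_t = self.shared(x_target)
        probs, dists = [], []
        for spec, clf, x_s in zip(self.specific, self.classifiers, x_sources):
            f_t = spec(h_t)                     # domain-specific target features
            f_s = spec(self.shared(x_s))        # same branch applied to its source
            probs.append(F.softmax(clf(f_t), dim=1))
            dists.append(mmd_linear(f_s, f_t))  # source-target distribution distance
        # Sources closer to the target receive larger fusion weights.
        weights = F.softmax(-torch.stack(dists), dim=0)
        return sum(w * p for w, p in zip(weights, probs))

# Toy usage: 3 source subjects, 310-dim differential-entropy features, 3 emotion classes.
model = MultiSourceDANet(n_sources=3)
x_t = torch.randn(8, 310)
x_s = [torch.randn(8, 310) for _ in range(3)]
pred = model(x_t, x_s)   # (8, 3) fused class probabilities for the target batch
```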
Pages: 12
Related Papers
50 records
  • [41] Joint Temporal Convolutional Networks and Adversarial Discriminative Domain Adaptation for EEG-Based Cross-Subject Emotion Recognition
    He, Zhipeng
    Zhong, Yongshi
    Pan, Jiahui
    2022 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2022, : 3214 - 3218
  • [42] Orthogonal semi-supervised regression with adaptive label dragging for cross-session EEG emotion recognition
    Sha, Tianhui
    Peng, Yong
    JOURNAL OF KING SAUD UNIVERSITY-COMPUTER AND INFORMATION SCIENCES, 2023, 35 (04) : 139 - 151
  • [43] Cross-Subject Cognitive Workload Recognition Based on EEG and Deep Domain Adaptation
    Zhou, Yueying
    Wang, Pengpai
    Gong, Peiliang
    Wei, Fulin
    Wen, Xuyun
    Wu, Xia
    Zhang, Daoqiang
    IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 2023, 72
  • [44] Joint Feature Adaptation and Graph Adaptive Label Propagation for Cross-Subject Emotion Recognition From EEG Signals
    Peng, Yong
    Wang, Wenjuan
    Kong, Wanzeng
    Nie, Feiping
    Lu, Bao-Liang
    Cichocki, Andrzej
    IEEE TRANSACTIONS ON AFFECTIVE COMPUTING, 2022, 13 (04) : 1941 - 1958
  • [45] Cross-Subject Emotion Recognition Using Fused Entropy Features of EEG
    Zuo, Xin
    Zhang, Chi
    Hamalainen, Timo
    Gao, Hanbing
    Fu, Yu
    Cong, Fengyu
    ENTROPY, 2022, 24 (09)
  • [47] A deep multi-source adaptation transfer network for cross-subject electroencephalogram emotion recognition
    Wang, Fei
    Zhang, Weiwei
    Xu, Zongfeng
    Ping, Jingyu
    Chu, Hao
    NEURAL COMPUTING & APPLICATIONS, 2021, 33 (15) : 9061 - 9073
  • [48] Fusing Frequency-Domain Features and Brain Connectivity Features for Cross-Subject Emotion Recognition
    Chen, Chuangquan
    Li, Zhencheng
    Wan, Feng
    Xu, Leicai
    Bezerianos, Anastasios
    Wang, Hongtao
    IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 2022, 71
  • [49] Cross-Subject Emotion Recognition From Multichannel EEG Signals Using Multivariate Decomposition and Ensemble Learning
    Vempati, Raveendrababu
    Sharma, Lakhan Dev
    Tripathy, Rajesh Kumar
    IEEE TRANSACTIONS ON COGNITIVE AND DEVELOPMENTAL SYSTEMS, 2025, 17 (01) : 77 - 88
  • [50] CiABL: Completeness-Induced Adaptative Broad Learning for Cross-Subject Emotion Recognition With EEG and Eye Movement Signals
    Gong, Xinrong
    Chen, C. L. Philip
    Hu, Bin
    Zhang, Tong
    IEEE TRANSACTIONS ON AFFECTIVE COMPUTING, 2024, 15 (04) : 1970 - 1984