Multisource Associate Domain Adaptation for Cross-Subject and Cross-Session EEG Emotion Recognition

Cited by: 35
|
Authors
She, Qingshan [1 ]
Zhang, Chenqi [2 ]
Fang, Feng [3 ]
Ma, Yuliang [1 ]
Zhang, Yingchun [3 ]
Affiliations
[1] Hangzhou Dianzi Univ, Sch Automat, Hangzhou 310018, Zhejiang, Peoples R China
[2] Hangzhou Dianzi Univ, HDU ITMO Joint Inst, Hangzhou 310018, Zhejiang, Peoples R China
[3] Univ Houston, Dept Biomed Engn, Houston, TX 77204 USA
Funding
National Natural Science Foundation of China;
Keywords
Feature extraction; Emotion recognition; Electroencephalography; Brain modeling; Adaptation models; Data models; Data mining; Domain adaptation (DA); electroencephalogram (EEG); emotion recognition; transfer learning; DIFFERENTIAL ENTROPY FEATURE;
DOI
10.1109/TIM.2023.3277985
Chinese Library Classification (CLC)
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology];
Discipline Code
0808; 0809;
Abstract
Emotion recognition is important in brain-computer interface (BCI) applications, and building a recognition model that is robust across subjects and sessions is critical for emotion-based BCI systems. The electroencephalogram (EEG) is a widely used tool for recognizing different emotional states. However, EEG has small amplitude, a low signal-to-noise ratio, and nonstationary properties, resulting in large differences across subjects. To address these problems, this article proposes a new emotion recognition method based on a multisource associate domain adaptation (DA) network that considers both domain-invariant and domain-specific features. First, separate branches are constructed for multiple source domains, under the assumption that different EEG data share the same low-level features. Second, domain-specific features are extracted using one-to-one associate DA. Then, weighted scores for the specific sources are obtained according to the distribution distance, and multiple source classifiers are combined with the corresponding weights. Finally, EEG emotion recognition experiments were conducted on the SEED, DEAP, and SEED-IV datasets. In the cross-subject experiments, the average accuracy was 86.16% on SEED, 65.59% on DEAP, and 59.29% on SEED-IV. In the cross-session experiments, the accuracies on SEED and SEED-IV were 91.10% and 66.68%, respectively. The proposed method achieves better classification results than state-of-the-art DA methods.
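The abstract describes the pipeline only at a high level: a shared low-level extractor, one domain-specific branch and classifier per source domain, and a final prediction formed by weighting the per-source classifiers according to a source-target distribution distance. The following is a minimal PyTorch-style sketch of that weighted multisource combination, written under stated assumptions: the class name MultiSourceNet, the linear-kernel mmd distance (standing in for the paper's distribution distance), and the feature dimensions (e.g., 310-dimensional differential entropy features) are illustrative choices, not the authors' implementation.

```python
# Minimal sketch of a multisource DA classifier with distribution-distance
# weighting. Hypothetical simplification for illustration; not the paper's code.
import torch
import torch.nn as nn

def mmd(x, y):
    """Linear-kernel MMD between two feature batches (illustrative distance)."""
    return (x.mean(dim=0) - y.mean(dim=0)).pow(2).sum()

class MultiSourceNet(nn.Module):
    def __init__(self, in_dim=310, feat_dim=64, n_classes=3, n_sources=14):
        super().__init__()
        # Shared low-level feature extractor, common to all source branches.
        self.shared = nn.Sequential(nn.Linear(in_dim, 128), nn.ReLU())
        # One domain-specific extractor and classifier head per source domain.
        self.specific = nn.ModuleList(
            [nn.Sequential(nn.Linear(128, feat_dim), nn.ReLU()) for _ in range(n_sources)])
        self.heads = nn.ModuleList(
            [nn.Linear(feat_dim, n_classes) for _ in range(n_sources)])

    def forward(self, x_target, x_sources):
        """x_target: (B, in_dim); x_sources: list of per-source batches."""
        z_t = self.shared(x_target)
        logits, dists = [], []
        for k, (extract, head) in enumerate(zip(self.specific, self.heads)):
            f_t = extract(z_t)                         # target features in branch k
            f_s = extract(self.shared(x_sources[k]))   # source-k features in branch k
            logits.append(head(f_t))
            dists.append(mmd(f_s, f_t))                # smaller distance -> larger weight
        w = torch.softmax(-torch.stack(dists), dim=0)  # distance-based source weights
        # Weighted sum of per-source class probabilities for the target batch.
        probs = torch.stack(
            [w[k] * torch.softmax(l, dim=1) for k, l in enumerate(logits)]).sum(0)
        return probs, dists
```

In a full training loop, each branch would additionally be trained with its own source classification loss and the one-to-one associate DA objective described in the paper; this sketch only shows how distance-derived weights could combine the per-source predictions at inference time.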
Pages: 12
Related Papers
50 records in total
  • [31] MASS: A Multisource Domain Adaptation Network for Cross-Subject Touch Gesture Recognition
    Li, Yun-Kai
    Meng, Qing-Hao
    Wang, Ya-Xin
    Yang, Tian-Hao
    Hou, Hui-Rang
    IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS, 2023, 19 (03) : 3099 - 3108
  • [32] CFDA-CSF: A Multi-Modal Domain Adaptation Method for Cross-Subject Emotion Recognition
    Jimenez-Guarneros, Magdiel
    Fuentes-Pineda, Gibran
    IEEE TRANSACTIONS ON AFFECTIVE COMPUTING, 2024, 15 (03) : 1502 - 1513
  • [33] Joint EEG Feature Transfer and Semisupervised Cross-Subject Emotion Recognition
    Peng, Yong
    Liu, Honggang
    Kong, Wanzeng
    Nie, Feiping
    Lu, Bao-Liang
    Cichocki, Andrzej
    IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS, 2023, 19 (07) : 8104 - 8115
  • [34] Domain Adaptation for EEG Emotion Recognition Based on Latent Representation Similarity
    Li, Jinpeng
    Qiu, Shuang
    Du, Changde
    Wang, Yixin
    He, Huiguang
    IEEE TRANSACTIONS ON COGNITIVE AND DEVELOPMENTAL SYSTEMS, 2020, 12 (02) : 344 - 353
  • [35] GNN-based multi-source domain prototype representation for cross-subject EEG emotion recognition
    Guo, Yi
    Tang, Chao
    Wu, Hao
    Chen, Badong
    NEUROCOMPUTING, 2024, 609
  • [36] Contrastive Learning of Subject-Invariant EEG Representations for Cross-Subject Emotion Recognition
    Shen, Xinke
    Liu, Xianggen
    Hu, Xin
    Zhang, Dan
    Song, Sen
    IEEE TRANSACTIONS ON AFFECTIVE COMPUTING, 2023, 14 (03) : 2496 - 2511
  • [37] Cross-Session Emotion Recognition by Joint Label-Common and Label-Specific EEG Features Exploration
    Peng, Yong
    Liu, Honggang
    Li, Junhua
    Huang, Jun
    Lu, Bao-Liang
    Kong, Wanzeng
    IEEE TRANSACTIONS ON NEURAL SYSTEMS AND REHABILITATION ENGINEERING, 2023, 31 : 759 - 768
  • [38] Cross-Subject Emotion Recognition Using Deep Adaptation Networks
    Li, He
    Jin, Yi-Ming
    Zheng, Wei-Long
    Lu, Bao-Liang
    NEURAL INFORMATION PROCESSING (ICONIP 2018), PT V, 2018, 11305 : 403 - 413
  • [39] Cross-subject and Cross-gender Emotion Classification from EEG
    Zhu, Jia-Yi
    Zheng, Wei-Long
    Lu, Bao-Liang
    WORLD CONGRESS ON MEDICAL PHYSICS AND BIOMEDICAL ENGINEERING, 2015, VOLS 1 AND 2, 2015, 51 : 1188 - 1191
  • [40] Cross-subject EEG emotion recognition using multi-source domain manifold feature selection
    She, Qingshan
    Shi, Xinsheng
    Fang, Feng
    Ma, Yuliang
    Zhang, Yingchun
    COMPUTERS IN BIOLOGY AND MEDICINE, 2023, 159