GNN-based multi-source domain prototype representation for cross-subject EEG emotion recognition

Cited: 2
Authors
Guo, Yi [1 ,2 ]
Tang, Chao [1 ,2 ]
Wu, Hao [3 ]
Chen, Badong [1 ,2 ]
Affiliations
[1] Xi An Jiao Tong Univ, Natl Engn Res Ctr Visual Informat & Applicat, Natl Key Lab Human Machine Hybrid Augmented Intell, Xian 710049, Peoples R China
[2] Xi An Jiao Tong Univ, Inst Artificial Intelligence & Robot, Xian 710049, Peoples R China
[3] Xian Univ Technol, Sch Elect Engn, Xian 710048, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Electroencephalography; Emotion recognition; Graph neural network; Transfer learning; Multi-source domain; ADAPTATION;
DOI
10.1016/j.neucom.2024.128445
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Emotion recognition based on electroencephalography (EEG) signals is a major area of affective computing. However, distributional differences between subjects have greatly hindered the large-scale application of EEG emotion recognition techniques. Most existing cross-subject methods treat multiple subjects as a single source domain, which introduces significant distributional differences within the source domain and hinders the model's ability to generalise to target subjects. In this paper, we propose a new method that combines a graph neural network-based prototype representation of multiple source domains with a clustering similarity loss. It consists of three parts: a multi-source domain prototype representation, a graph neural network, and the loss functions. The multi-source domain prototype representation treats each subject in the source domain as a separate sub-source domain and extracts prototype features from it, yielding a more fine-grained feature representation. The graph neural network models the association properties between prototypes and samples. In addition, we propose a similarity loss based on the idea of clustering, which makes maximal use of the similarity between samples in the target domain while ensuring that classification performance does not degrade. We conduct extensive experiments on two benchmark datasets, SEED and SEED IV. The experimental results validate the effectiveness of the proposed multi-source domain fusion approach and indicate its superiority over existing methods on cross-subject classification tasks.
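The two core ingredients named in the abstract — per-subject (sub-source domain) prototype extraction and a clustering-style similarity loss on target samples — can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the prototype here is simply the class-mean feature per subject, and `clustering_similarity_loss` is a hypothetical stand-in that rewards target samples whose soft prototype assignments agree, in the spirit of the described loss.

```python
import numpy as np

def domain_prototypes(features, labels, subjects):
    """Treat each source subject as its own sub-source domain and compute
    one prototype (class-mean feature vector) per (subject, class) pair."""
    protos = {}
    for s in np.unique(subjects):
        for c in np.unique(labels):
            mask = (subjects == s) & (labels == c)
            if mask.any():
                protos[(s, c)] = features[mask].mean(axis=0)
    return protos

def clustering_similarity_loss(target_feats, protos, temperature=1.0):
    """Softly assign each unlabelled target sample to the prototypes, then
    encourage pairs of target samples to share assignments (a clustering-style
    similarity objective; hypothetical stand-in for the paper's loss)."""
    P = np.stack(list(protos.values()))               # (K, d) prototype matrix
    sim = target_feats @ P.T / temperature            # similarity logits (n, K)
    sim -= sim.max(axis=1, keepdims=True)             # numerical stability
    q = np.exp(sim)
    q /= q.sum(axis=1, keepdims=True)                 # soft assignments (n, K)
    pairwise = q @ q.T                                # assignment agreement (n, n)
    # minimising -log(agreement) pulls similar target samples together
    return -np.log(np.clip(pairwise, 1e-8, None)).mean()
```

In the full method, the features would come from a trained encoder and the prototypes and samples would be connected as nodes of a graph neural network; the sketch only shows how per-subject prototyping keeps each subject's distribution distinct instead of pooling all subjects into one source domain.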
Pages: 12