Attention-Based Temporal Graph Representation Learning for EEG-Based Emotion Recognition

Cited by: 5
Authors
Li, Chao [1 ]
Wang, Feng [1 ]
Zhao, Ziping [1 ]
Wang, Haishuai [2 ]
Schuller, Bjorn W. [3 ,4 ]
Affiliations
[1] Tianjin Normal Univ, Coll Comp & Informat Engn, Tianjin 300387, Peoples R China
[2] Zhejiang Univ Coll, Dept Comp Sci, Hangzhou 310058, Peoples R China
[3] Univ Augsburg, Chair Embedded Intelligence Hlth Care & Wellbeing, D-86159 Augsburg, Germany
[4] Imperial Coll London, GLAM, London SW7 2AZ, England
Funding
National Natural Science Foundation of China;
Keywords
Electroencephalography; Feature extraction; Emotion recognition; Convolution; Brain modeling; Electrodes; Graph neural networks; Affective computing; attention mechanism; EEG; emotion recognition; graph convolution network; CLASSIFICATION;
DOI
10.1109/JBHI.2024.3395622
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline classification code
0812;
Abstract
Because emotional expression in the central nervous system is objective, EEG-based emotion recognition can effectively reflect humans' internal emotional states. In recent years, convolutional neural networks (CNNs) and recurrent neural networks (RNNs) have made significant strides in extracting local features and temporal dependencies from EEG signals. However, CNNs ignore the spatial distribution information of EEG electrodes; moreover, RNNs may suffer from exploding/vanishing gradients and high time consumption. To address these limitations, we propose an attention-based temporal graph representation network (ATGRNet) for EEG-based emotion recognition. First, a hierarchical attention mechanism is introduced to integrate feature representations from both frequency bands and channels, ordered by priority in the EEG signals. Second, a graph convolutional neural network with a top-k operation is utilized to capture the internal relationships between EEG electrodes under different emotion patterns. Next, a residual-based graph readout mechanism is applied to aggregate the node-level EEG feature representations into graph-level representations. Finally, the obtained graph-level representations are fed into a temporal convolutional network (TCN) to extract the temporal dependencies between EEG frames. We evaluated the proposed ATGRNet on the SEED, DEAP, and FACED datasets. The experimental findings show that ATGRNet surpasses state-of-the-art graph-based methods for EEG-based emotion recognition.
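The top-k operation described in the abstract can be illustrated with a minimal NumPy sketch of generic top-k graph pooling (in the style of Graph U-Net-type pooling, not the authors' actual implementation; the function name and the learnable score vector `p` are hypothetical):

```python
import numpy as np

def top_k_graph_pool(X, A, p, k):
    """Generic top-k graph pooling sketch: score each node by its
    projection onto a learnable vector p, keep the k highest-scoring
    nodes, and gate the kept features by the sigmoid of their scores.
    X: (N, F) node features; A: (N, N) adjacency; p: (F,) scorer."""
    scores = X @ p / (np.linalg.norm(p) + 1e-12)   # (N,) node scores
    idx = np.argsort(scores)[::-1][:k]             # indices of top-k nodes
    gate = 1.0 / (1.0 + np.exp(-scores[idx]))      # sigmoid gating weights
    X_pool = X[idx] * gate[:, None]                # gated node features
    A_pool = A[np.ix_(idx, idx)]                   # induced subgraph adjacency
    return X_pool, A_pool, idx
```

In an EEG setting, the N nodes would correspond to electrodes and the pooled subgraph would retain only the electrodes most relevant to the current emotion pattern.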
Pages: 5755-5767
Page count: 13