Graph Convolutional Network With Connectivity Uncertainty for EEG-Based Emotion Recognition

Cited by: 8
Authors
Gao, Hongxiang [1 ,2 ]
Wang, Xingyao [1 ,3 ]
Chen, Zhenghua [2 ]
Wu, Min [2 ]
Cai, Zhipeng [1 ]
Zhao, Lulu [1 ]
Li, Jianqing [1 ,4 ]
Liu, Chengyu [1 ]
Affiliations
[1] Southeast Univ, Sch Instrument Sci & Engn, State Key Lab Digital Med Engn, Nanjing 210096, Peoples R China
[2] ASTAR, Inst Infocomm Res, Singapore 138632, Singapore
[3] ASTAR, Inst High Performance Comp, Singapore 138632, Singapore
[4] Nanjing Med Univ, Sch Biomed Engn & Informat, Nanjing 211166, Peoples R China
Funding
National Natural Science Foundation of China; China Postdoctoral Science Foundation;
Keywords
Electroencephalography; Uncertainty; Emotion recognition; Convolution; Brain modeling; Feature extraction; Noise measurement; EEG; connectivity uncertainty; graph neural network;
DOI
10.1109/JBHI.2024.3416944
CLC Classification Number
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
Automatic emotion recognition based on multichannel electroencephalography (EEG) holds great potential for advancing human-computer interaction. However, several significant challenges persist in existing research on algorithmic emotion recognition: the need for a robust model that learns discriminative node attributes over long paths, the exploration of ambiguous topological information across EEG channels and effective frequency bands, and the mapping between intrinsic data quality and the provided labels. To address these challenges, this study introduces a distribution-based uncertainty method that represents spatial dependencies and temporal-spectral relativeness in EEG signals within a Graph Convolutional Network (GCN) architecture, adaptively assigning weights when aggregating functional node features so that long paths are captured effectively while over-smoothing is mitigated. Moreover, a graph mixup technique is employed to enhance latent connected edges and mitigate noisy-label issues. Furthermore, we integrate the uncertainty learning method with deep GCN weights in a one-way learning fashion, termed Connectivity Uncertainty GCN (CU-GCN). We evaluate our approach on two widely used emotion recognition datasets, SEED and SEED-IV. The experimental results demonstrate the superiority of our methodology over previous methods, yielding significant improvements, and ablation studies confirm the substantial contribution of each component to the overall performance.
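
The abstract only names the two core ideas (a distribution-based connectivity uncertainty inside the GCN, and graph mixup against noisy labels) without detail. The following minimal PyTorch sketch illustrates what such components could look like; the class and function names (UncertainGCNLayer, graph_mixup), the Gaussian parameterization of the adjacency, and all shapes are assumptions made here for illustration and are not taken from the CU-GCN paper.

import torch
import torch.nn as nn
import torch.nn.functional as F

class UncertainGCNLayer(nn.Module):
    # One GCN layer whose dense functional adjacency is sampled from a learned
    # Gaussian (mean + log-variance), so edge weights carry uncertainty (assumption).
    def __init__(self, in_dim, out_dim, num_nodes):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)
        self.adj_mu = nn.Parameter(0.01 * torch.randn(num_nodes, num_nodes))
        self.adj_logvar = nn.Parameter(torch.zeros(num_nodes, num_nodes))

    def forward(self, x):                              # x: (batch, nodes, in_dim)
        if self.training:                              # reparameterized sample of the adjacency
            eps = torch.randn_like(self.adj_mu)
            adj = self.adj_mu + eps * torch.exp(0.5 * self.adj_logvar)
        else:                                          # use the mean connectivity at test time
            adj = self.adj_mu
        adj = torch.softmax(adj, dim=-1)               # row-normalize the learned adjacency
        return F.relu(torch.matmul(adj, self.lin(x)))  # aggregate neighbor features

def graph_mixup(x, y, alpha=0.2):
    # Convexly mix node-feature tensors and (one-hot or soft) label tensors of shuffled
    # pairs, a common regularization against noisy labels (generic mixup, not the paper's exact form).
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    perm = torch.randperm(x.size(0))
    return lam * x + (1.0 - lam) * x[perm], lam * y + (1.0 - lam) * y[perm]

Under these assumptions, a stack of such layers over multichannel EEG features, followed by pooling and a classifier, would play the role the abstract attributes to CU-GCN; the paper's actual uncertainty parameterization and mixup formulation may differ.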
Pages: 5917-5928
Number of pages: 12
Related References
63 references in total
[31] Li Y., Zheng W., Wang L., Zong Y., Cui Z. From Regional to Global Brain: A Novel Hierarchical Spatial-Temporal Neural Network Model for EEG Emotion Recognition. IEEE Transactions on Affective Computing, 2022, 13(2): 568-578.
[32] Lin Y.-P., Wang C.-H., Jung T.-P., Wu T.-L., Jeng S.-K., Duann J.-R., Chen J.-H. EEG-Based Emotion Recognition in Music Listening. IEEE Transactions on Biomedical Engineering, 2010, 57(7): 1798-1806.
[33] Liu Y.-J., Yu M., Zhao G., Song J., Ge Y., Shi Y. Real-Time Movie-Induced Discrete Emotion Recognition from EEG Signals. IEEE Transactions on Affective Computing, 2018, 9(4): 550-562.
[34] Munikoti S., Agarwal D., Das L., Natarajan B. A general framework for quantifying aleatoric and epistemic uncertainty in graph neural networks. Neurocomputing, 2023, 521: 1-10.
[35] Naseer N., Hong K.-S. fNIRS-based brain-computer interfaces: a review. Frontiers in Human Neuroscience, 2015, 9.
[36] Niepert M. Proceedings of Machine Learning Research (PMLR), 2016, Vol. 48.
[37] Olofsson J. K., Nordin S., Sequeira H., Polich J. Affective picture processing: An integrative review of ERP findings. Biological Psychology, 2008, 77(3): 247-265.
[38] Palmiero M., Piccardi L. Frontal EEG Asymmetry of Mood: A Mini-Review. Frontiers in Behavioral Neuroscience, 2017, 11.
[39] Paul S. 2015 International Conference on Robotics, Automation, Control and Embedded Systems (RACE), 2015.
[40] Shi L.-C. Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), 2013, p. 6627. DOI: 10.1109/EMBC.2013.6611075.