Automatically Extracting and Utilizing EEG Channel Importance Based on Graph Convolutional Network for Emotion Recognition

Cited by: 3
Authors
Yang, Kun [1 ,2 ]
Yao, Zhenning [1 ,2 ]
Zhang, Keze [1 ,2 ]
Xu, Jing [3 ]
Zhu, Li [1 ,2 ]
Cheng, Shichao [1 ,2 ]
Zhang, Jianhai [1 ,2 ]
Affiliations
[1] Hangzhou Dianzi Univ, Sch Comp Sci & Technol, Hangzhou 310018, Peoples R China
[2] Key Lab Brain Machine Collaborat Intelligence Zhej, Hangzhou 310018, Peoples R China
[3] Zhejiang Gongshang Univ, Sch Stat & Math, Hangzhou 310018, Peoples R China
Keywords
Brain modeling; Emotion recognition; Electroencephalography; Feature extraction; Convolution; Data mining; Task analysis; EEG; emotion recognition; graph convolutional network (GCN); core network; channel importance; channel convolution; sentiment classification
DOI
10.1109/JBHI.2024.3404146
CLC number
TP [Automation technology, computer technology]
Discipline classification code
0812
Abstract
Graph convolutional networks (GCNs) based on the brain network have been widely used for EEG emotion recognition. However, most studies train their models directly without first reducing the network's dimensionality. In fact, some nodes and edges carry information that is invalid, or even interfering, for the current task, so it is necessary to reduce the network dimension and extract the core network. To address the problem of extracting and utilizing the core network, a core-network extraction model based on channel weighting and a graph convolutional network (CWGCN), and a graph convolutional network model based on channel convolution and style-based recalibration (CCSR-GCN) for emotion recognition, are proposed. The CWGCN model automatically extracts the core network and the channel-importance parameters in a data-driven manner. The CCSR-GCN model innovatively uses the output of the CWGCN model to identify the emotional state. Experimental results on SEED show that: 1) core-network extraction helps improve the performance of the GCN model; 2) CWGCN and CCSR-GCN achieve better results than currently popular methods. The idea and its implementation in this paper provide a novel and successful perspective for applying GCNs to brain-network analysis in other specific tasks.
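The channel-weighting idea described in the abstract can be illustrated with a minimal forward-pass sketch. This is not the paper's actual CWGCN architecture: the toy adjacency matrix, the channel-importance vector `w`, and all shapes here are made-up placeholders, and the weighting is shown inside a single standard GCN layer.

```python
import numpy as np

def normalized_adjacency(A):
    # Standard GCN propagation matrix: D^{-1/2} (A + I) D^{-1/2}
    A = A + np.eye(A.shape[0])
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A.sum(axis=1)))
    return d_inv_sqrt @ A @ d_inv_sqrt

def channel_weighted_gcn_layer(X, A, w, W):
    # X: (channels, features) EEG channel features
    # w: (channels,) learnable per-channel importance weights
    # Scale each channel's features by its importance, then propagate.
    Xw = w[:, None] * X
    return np.maximum(normalized_adjacency(A) @ Xw @ W, 0.0)  # ReLU

# Toy example: 4 EEG channels, 5 input features, 3 hidden units.
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 5))
A = np.array([[0, 1, 1, 0],
              [1, 0, 0, 1],
              [1, 0, 0, 1],
              [0, 1, 1, 0]], dtype=float)
w = np.array([1.0, 0.2, 0.8, 0.1])   # low-weight channels are candidates for pruning
W = rng.standard_normal((5, 3))

H = channel_weighted_gcn_layer(X, A, w, W)
print(H.shape)  # -> (4, 3)
```

After training, channels whose learned weight in `w` stays near zero contribute little to the propagated representation, which is the intuition behind extracting a smaller "core network" before the recognition stage.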
Pages: 4588-4598 (11 pages)