Semi-supervised EEG emotion recognition model based on enhanced graph fusion and GCN

Cited: 20
Authors
Li, Guangqiang [1 ]
Chen, Ning [1 ]
Jin, Jing [1 ]
Affiliations
[1] East China Univ Sci & Technol, Sch Informat Sci & Engn, Shanghai 200237, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
EEG; emotion recognition; Graph Convolutional Network (GCN); graph fusion; network enhancement; CLASSIFICATION; EXPRESSIONS; NETWORKS; SIGNALS; DOMAIN;
DOI
10.1088/1741-2552/ac63ec
Chinese Library Classification
R318 [Biomedical Engineering];
Discipline Code
0831;
Abstract
Objective. To take full advantage of both labeled and unlabeled data, the Graph Convolutional Network (GCN) was introduced into electroencephalography (EEG) based emotion recognition to achieve feature propagation. However, a single feature cannot represent the emotional state entirely and precisely, owing to the instability of the EEG signal and the complexity of the emotional state. In addition, noise in the graph may greatly degrade performance. Solving these problems requires feature/similarity fusion and noise-reduction strategies. Approach. A semi-supervised EEG emotion recognition model combining graph fusion, network enhancement, and feature fusion was proposed. First, different features were extracted from the EEG and each was compacted by Principal Component Analysis (PCA). Second, a sample-by-sample similarity matrix (SSM) was constructed from each feature, and similarity network fusion (SNF) was adopted to fuse the graphs corresponding to the different SSMs, exploiting their complementarity. Network enhancement (NE) was then performed on the fused graph to reduce its noise. Finally, a GCN was applied to the concatenated features and the enhanced fused graph to achieve feature propagation. Main results. Experimental results demonstrated that: (a) when 5.30% of SEED and 7.20% of SEED-IV samples were chosen as the labeled samples, the minimum classification accuracy improvement achieved by the proposed scheme over state-of-the-art schemes was 1.52% on SEED and 13.14% on SEED-IV; (b) when 8.00% of SEED and 9.60% of SEED-IV samples were chosen as the labeled samples, the minimum training time reduction achieved by the proposed scheme over state-of-the-art schemes was 46.75 s and 22.55 s, respectively; (c) graph fusion, network enhancement, and feature fusion all contributed to the performance gains; (d) the key hyperparameters affecting performance were relatively few and easy to set for outstanding performance. Significance. This paper demonstrated that combining graph fusion, network enhancement, and feature fusion helps to enhance GCN-based EEG emotion recognition.
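The pipeline described in the Approach can be sketched in NumPy. This is a minimal illustrative sketch, not the authors' implementation: the RBF kernel for the SSM, element-wise averaging in place of the full iterative SNF algorithm, a simplified diffusion in place of the published NE procedure, and a single untrained propagation step in place of a trained GCN are all simplifying assumptions, and the synthetic feature views merely stand in for real EEG features.

```python
import numpy as np

def pca_compact(X, k):
    """Compact one feature view to k principal components (centering + SVD)."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

def rbf_ssm(X, sigma=1.0):
    """Sample-by-sample similarity matrix via an RBF kernel (one common choice)."""
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def fuse_graphs(graphs):
    """Stand-in for SNF: element-wise average of the per-view graphs."""
    return np.mean(np.stack(graphs), axis=0)

def enhance(W, alpha=0.9, steps=3):
    """Stand-in for Network Enhancement: repeated diffusion that damps weak, noisy edges."""
    P = W / W.sum(axis=1, keepdims=True)  # row-stochastic transition matrix
    E = P.copy()
    for _ in range(steps):
        E = alpha * P @ E @ P.T + (1.0 - alpha) * np.eye(len(W))
    return E

def gcn_propagate(W, H):
    """One symmetric-normalized propagation step: D^{-1/2} W D^{-1/2} H."""
    d_inv_sqrt = 1.0 / np.sqrt(W.sum(axis=1))
    return (d_inv_sqrt[:, None] * W * d_inv_sqrt[None, :]) @ H

# Demo on synthetic data standing in for two EEG feature views.
rng = np.random.default_rng(0)
n = 20
views = [rng.normal(size=(n, 8)), rng.normal(size=(n, 12))]

compact = [pca_compact(X, k=4) for X in views]            # per-view PCA compaction
W = enhance(fuse_graphs([rbf_ssm(Z) for Z in compact]))   # fuse SSMs, then denoise
H = np.concatenate(compact, axis=1)                       # feature fusion by concatenation
H_out = gcn_propagate(W, H)                               # propagate features on the graph
print(H_out.shape)  # (20, 8)
```

In the paper, propagation is performed by a trained GCN with learnable weights and a classification loss on the labeled subset; the single fixed propagation step above only shows how the enhanced fused graph mixes features between similar samples.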
Pages: 14