EEG Emotion Recognition Using Dynamical Graph Convolutional Neural Networks

Citations: 940
Authors
Song, Tengfei [1 ,2 ]
Zheng, Wenming [1 ,2 ]
Song, Peng [3 ]
Cui, Zhen [4 ]
Affiliations
[1] Southeast Univ, Key Lab Child Dev & Learning Sci, Minist Educ, Nanjing 210096, Peoples R China
[2] Southeast Univ, Sch Informat Sci & Engn, Nanjing 210096, Peoples R China
[3] Yantai Univ, Sch Comp & Control Engn, Yantai 264005, Peoples R China
[4] Nanjing Univ Sci & Technol, Sch Comp Sci, Nanjing 210094, Jiangsu, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Electroencephalography; Emotion recognition; Brain modeling; Feature extraction; Convolutional neural networks; Convolution; Biological neural networks; EEG emotion recognition; adjacency matrix; graph convolutional neural networks (GCNN); dynamical graph convolutional neural networks (DGCNN); EXPRESSION;
DOI
10.1109/TAFFC.2018.2817622
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In this paper, a multichannel EEG emotion recognition method based on novel dynamical graph convolutional neural networks (DGCNN) is proposed. The basic idea of the proposed method is to use a graph to model the multichannel EEG features and then perform EEG emotion classification based on this model. Unlike traditional graph convolutional neural network (GCNN) methods, the proposed DGCNN method can dynamically learn the intrinsic relationships between different electroencephalogram (EEG) channels, represented by an adjacency matrix, via neural network training, which benefits more discriminative EEG feature extraction. The learned adjacency matrix is then used to extract more discriminative features and thereby improve EEG emotion recognition. We conduct extensive experiments on the SJTU emotion EEG dataset (SEED) and the DREAMER dataset. The experimental results demonstrate that the proposed method achieves better recognition performance than state-of-the-art methods: on the SEED database, average recognition accuracies of 90.4 percent in the subject-dependent experiment and 79.95 percent in the subject-independent cross-validation experiment are achieved, while on the DREAMER database, average accuracies of 86.23, 84.54, and 85.02 percent are obtained for valence, arousal, and dominance classification, respectively.
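The core operation the abstract describes is a graph convolution over EEG channels whose adjacency matrix is itself a trainable parameter rather than a fixed prior. The following NumPy sketch illustrates one such forward pass; the channel count, feature dimension, ReLU activation, and symmetric normalized-Laplacian formulation are illustrative assumptions, not the paper's exact configuration:

```python
import numpy as np

def normalized_laplacian(adj):
    """L = I - D^{-1/2} A D^{-1/2} for a non-negative adjacency matrix."""
    a = np.maximum(adj, 0.0)  # keep learned edge weights non-negative
    d = a.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(d, 1e-8))
    return np.eye(a.shape[0]) - d_inv_sqrt[:, None] * a * d_inv_sqrt[None, :]

def graph_conv_forward(x, adj, w):
    """One graph-convolution layer: features are propagated along the
    (learned) channel graph, then linearly transformed and rectified.
    x: (channels, in_feats), adj: (channels, channels), w: (in_feats, out_feats)."""
    lap = normalized_laplacian(adj)
    return np.maximum(lap @ x @ w, 0.0)  # ReLU

# Toy example: 4 EEG channels, 5 features per channel (hypothetical sizes).
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 5))
adj = rng.random((4, 4))      # trainable in the real model, random here
adj = (adj + adj.T) / 2       # symmetric inter-channel relations
w = rng.standard_normal((5, 3))
out = graph_conv_forward(x, adj, w)
print(out.shape)              # (4, 3): per-channel output features
```

In the actual DGCNN, `adj` would be updated by backpropagation alongside `w`, so the model discovers which channel pairs carry emotion-relevant relationships instead of relying on a hand-specified electrode graph.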
Pages: 532-541
Page count: 10