STSNet: a novel spatio-temporal-spectral network for subject-independent EEG-based emotion recognition

Cited by: 11
Authors
Li, Rui [1 ]
Ren, Chao [1 ]
Zhang, Sipo [1 ]
Yang, Yikun [1 ]
Zhao, Qiqi [1 ]
Hou, Kechen [1 ]
Yuan, Wenjie [1 ]
Zhang, Xiaowei [1 ]
Hu, Bin [1 ]
Affiliations
[1] Lanzhou Univ, Sch Informat Sci & Engn, Gansu Prov Key Lab Wearable Comp, Lanzhou 730000, Gansu, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Emotion recognition; EEG; Spatio-temporal-spectral feature fusion; Riemannian manifold; ManifoldNet; BiLSTM; FEATURES; FUSION;
DOI
10.1007/s13755-023-00226-x
CLC Number
R-058
Subject Classification Code
Abstract
How to exploit the characteristics of EEG signals to obtain more complementary and discriminative data representations is an open issue in EEG-based emotion recognition. Many studies have attempted spatio-temporal or spatio-spectral feature fusion to obtain higher-level representations of EEG data. However, these studies ignored the complementarity among the spatial, temporal and spectral domains of EEG signals, which limits the classification ability of the resulting models. This study proposed an end-to-end network built on ManifoldNet and BiLSTM networks, named STSNet. STSNet first constructs, from the EEG signals, a 4-D spatio-temporal-spectral data representation and a spatio-temporal data representation in manifold space. These two representations are then fed into the ManifoldNet network and the BiLSTM network, respectively, to compute higher-level features and achieve spatio-temporal-spectral feature fusion. Finally, extensive comparative experiments were performed on two public datasets, DEAP and DREAMER, using a subject-independent leave-one-subject-out cross-validation strategy. On the DEAP dataset, the average accuracies for valence and arousal are 69.38% and 71.88%, respectively; on the DREAMER dataset, they are 78.26% and 82.37%. The experimental results show that the STSNet model achieves good emotion recognition performance.
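
As a rough illustration of the two-branch design the abstract describes, the following is a minimal sketch assuming PyTorch, arbitrary layer sizes, an ordinary convolutional stand-in for ManifoldNet's manifold-valued layers, and simple concatenation for the fusion step; it is not the authors' implementation.

# Minimal two-branch spatio-temporal-spectral fusion sketch in the spirit of STSNet.
# All layer sizes, the convolutional fallback used in place of ManifoldNet, and the
# concatenation-based fusion are illustrative assumptions.
import torch
import torch.nn as nn


class TwoBranchEEGClassifier(nn.Module):
    def __init__(self, n_channels=32, n_bands=4, n_classes=2):
        super().__init__()
        # Branch 1 (stand-in for ManifoldNet): consumes a 4-D spatio-temporal-spectral
        # tensor (bands x channels x channels, covariance-like maps per segment).
        self.spectral_branch = nn.Sequential(
            nn.Conv2d(n_bands, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(4),
            nn.Flatten(),
            nn.Linear(16 * 4 * 4, 64),
        )
        # Branch 2: BiLSTM over the spatio-temporal representation (time steps x channels).
        self.bilstm = nn.LSTM(input_size=n_channels, hidden_size=64,
                              batch_first=True, bidirectional=True)
        # Fusion + classifier: concatenate the two branch embeddings.
        self.classifier = nn.Sequential(
            nn.Linear(64 + 2 * 64, 64),
            nn.ReLU(),
            nn.Linear(64, n_classes),
        )

    def forward(self, x_spec, x_temp):
        f_spec = self.spectral_branch(x_spec)        # (B, 64)
        _, (h, _) = self.bilstm(x_temp)              # h: (2, B, 64), forward/backward states
        f_temp = torch.cat([h[0], h[1]], dim=1)      # (B, 128)
        return self.classifier(torch.cat([f_spec, f_temp], dim=1))


# Smoke test with random tensors shaped like a DEAP-style segment batch.
model = TwoBranchEEGClassifier()
x_spec = torch.randn(8, 4, 32, 32)   # batch, frequency bands, channels, channels
x_temp = torch.randn(8, 128, 32)     # batch, time steps, channels
print(model(x_spec, x_temp).shape)   # torch.Size([8, 2])

Under the leave-one-subject-out protocol mentioned in the abstract, a model of this kind would be retrained once per subject, with that subject's segments held out entirely as the test set.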
Pages: 16