Emotion Recognition from Spatio-Temporal Representation of EEG Signals via 3D-CNN with Ensemble Learning Techniques

Citations: 21
Authors
Yuvaraj, Rajamanickam [1 ]
Baranwal, Arapan [2 ]
Prince, A. Amalin [3 ]
Murugappan, M. [4 ,5 ,6 ]
Mohammed, Javeed Shaikh [7 ]
Affiliations
[1] Nanyang Technol Univ, Natl Inst Educ, Singapore 637616, Singapore
[2] BITS Pilani, Dept Comp Sci & Informat Syst, Sancoale 403726, Goa, India
[3] BITS Pilani, Dept Elect & Elect Engn, Sancoale 403726, Goa, India
[4] Kuwait Coll Sci & Technol, Dept Elect & Commun Engn, Intelligent Signal Proc ISP Res Lab, Block 4, Doha 13133, Kuwait
[5] Vels Inst Sci Technol & Adv Studies, Fac Engn, Dept Elect & Commun Engn, Chennai 600117, Tamilnadu, India
[6] Univ Malaysia Perlis, Ctr Excellence Unmanned Aerial Syst CoEUAS, Kangar 02600, Perlis, Malaysia
[7] Prince Sattam bin Abdulaziz Univ, Coll Appl Med Sci, Dept Biomed Technol, Al Kharj 11942, Saudi Arabia
Keywords
hybrid models; 3D-CNN; deep neural networks; machine learning classifiers; emotion recognition;
DOI
10.3390/brainsci13040685
Chinese Library Classification (CLC)
Q189 [Neuroscience];
Discipline Code
071006;
Abstract
The recognition of emotions is one of the most challenging issues in human-computer interaction (HCI). EEG signals are widely used for emotion recognition because of their ease of acquisition, mobility, and convenience. Deep neural networks (DNNs) have produced excellent results in emotion recognition studies. Most studies, however, rely on separate methods to extract handcrafted features, such as the Pearson correlation coefficient (PCC), principal component analysis (PCA), and the Higuchi fractal dimension (HFD), even though DNNs are capable of learning meaningful features themselves. Furthermore, most earlier studies largely ignored the spatial information between channels, focusing mainly on time-domain and frequency-domain representations. This study uses a pre-trained 3D-CNN MobileNet model with transfer learning on a spatio-temporal representation of EEG signals to extract features for emotion recognition. In addition to fully connected layers, hybrid models were explored with other decision layers, namely a multilayer perceptron (MLP), k-nearest neighbors (KNN), an extreme learning machine (ELM), XGBoost (XGB), random forest (RF), and a support vector machine (SVM). This study also investigates the effect of post-processing, i.e., filtering the output labels. Extensive experiments were conducted on the SJTU Emotion EEG Dataset (SEED) (three classes) and SEED-IV (four classes), and the results were comparable to the state of the art. The conventional 3D-CNN with an ELM classifier achieved maximum accuracies of 89.18% on SEED and 81.60% on SEED-IV. Post-filtering improved the classification performance of the hybrid 3D-CNN with ELM model to 90.85% on SEED and 83.71% on SEED-IV. Accordingly, spatio-temporal features extracted from the EEG, combined with ensemble classifiers, were found to be the most effective approach for recognizing emotions compared with state-of-the-art methods.
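As context for the pipeline described in the abstract, the following minimal Python sketch (not the authors' implementation) illustrates the two decision-level ideas: an extreme learning machine (ELM) trained on features assumed to have already been extracted by a pre-trained 3D-CNN, and a sliding-window majority post-filter applied to the predicted label sequence. The class ELMClassifier, the function post_filter, the hidden-layer size, the feature dimensionality, and the smoothing window are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch (assumptions throughout): an ELM decision layer on top of
# precomputed 3D-CNN features, plus a majority-vote post-filter on the
# temporally ordered predicted labels.
import numpy as np


class ELMClassifier:
    """Single-hidden-layer ELM: random input weights, closed-form output weights."""

    def __init__(self, n_hidden=512, seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def _hidden(self, X):
        # Random nonlinear projection of the CNN features.
        return np.tanh(X @ self.W + self.b)

    def fit(self, X, y):
        n_features = X.shape[1]
        n_classes = int(y.max()) + 1
        self.W = self.rng.normal(size=(n_features, self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = self._hidden(X)
        T = np.eye(n_classes)[y]  # one-hot targets
        # Output weights via regularized least squares.
        self.beta = np.linalg.solve(H.T @ H + 1e-3 * np.eye(self.n_hidden), H.T @ T)
        return self

    def predict(self, X):
        return np.argmax(self._hidden(X) @ self.beta, axis=1)


def post_filter(labels, window=5):
    """Smooth a temporally ordered label sequence with a sliding majority vote."""
    labels = np.asarray(labels)
    half = window // 2
    out = labels.copy()
    for i in range(len(labels)):
        seg = labels[max(0, i - half): i + half + 1]
        out[i] = np.bincount(seg).argmax()
    return out


if __name__ == "__main__":
    # Stand-in for 3D-CNN features of consecutive EEG segments (hypothetical shapes).
    rng = np.random.default_rng(1)
    X_train, y_train = rng.normal(size=(600, 256)), rng.integers(0, 3, 600)
    X_test = rng.normal(size=(200, 256))

    clf = ELMClassifier(n_hidden=512).fit(X_train, y_train)
    y_pred = post_filter(clf.predict(X_test), window=5)  # filtered output labels
    print(y_pred[:10])
```

The ELM's output weights are obtained in closed form, which is what makes it attractive as a lightweight decision layer on top of a frozen CNN feature extractor; the majority filter stands in for the post-processing of output labels mentioned in the abstract.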
Pages: 17
Related Papers (50 in total)
[21]   Feature hypergraph representation learning on spatial-temporal correlations for EEG emotion recognition [J].
Li, Menghang ;
Qiu, Min ;
Zhu, Li ;
Kong, Wanzeng .
COGNITIVE NEURODYNAMICS, 2023, 17 (05) :1271-1281
[23]   Emotion recognition from EEG signals using machine learning model [J].
Akshay, K. R. ;
Sundar, Sumod ;
Shanir, Muhammed P. P. .
2022 5TH INTERNATIONAL CONFERENCE ON MULTIMEDIA, SIGNAL PROCESSING AND COMMUNICATION TECHNOLOGIES (IMPACT), 2022,
[24]   EEG-CNN-Souping: Interpretable emotion recognition from EEG signals using EEG-CNN-souping model and explainable AI [J].
Chaudary, Eamin ;
Khan, Sheeraz Ahmad ;
Mumtaz, Wajid .
COMPUTERS & ELECTRICAL ENGINEERING, 2025, 123
[25]   Spatio-Temporal Representation of an Electroencephalogram for Emotion Recognition Using a Three-Dimensional Convolutional Neural Network [J].
Cho, Jungchan ;
Hwang, Hyoseok .
SENSORS, 2020, 20 (12) :1-18
[26]   Attention-Based Temporal Graph Representation Learning for EEG-Based Emotion Recognition [J].
Li, Chao ;
Wang, Feng ;
Zhao, Ziping ;
Wang, Haishuai ;
Schuller, Bjorn W. .
IEEE JOURNAL OF BIOMEDICAL AND HEALTH INFORMATICS, 2024, 28 (10) :5755-5767
[27]   Emotion Recognition from EEG Signals Using Advanced Transformations and Deep Learning [J].
Cruz-Vazquez, Jonathan Axel ;
Montiel-Perez, Jesus Yalja ;
Romero-Herrera, Rodolfo ;
Rubio-Espino, Elsa .
MATHEMATICS, 2025, 13 (02)
[28]   Learning CNN features from DE features for EEG-based emotion recognition [J].
Hwang, Sunhee ;
Hong, Kibeom ;
Son, Guiyoung ;
Byun, Hyeran .
PATTERN ANALYSIS AND APPLICATIONS, 2020, 23 (03) :1323-1335
[30]   A comparative study of time-frequency features based spatio-temporal analysis with varying multiscale kernels for emotion recognition from EEG [J].
Khan, Md Raihan ;
Tania, Airin Akter ;
Ahmad, Mohiuddin .
BIOMEDICAL SIGNAL PROCESSING AND CONTROL, 2025, 107