Hybrid deep models for parallel feature extraction and enhanced emotion state classification

Cited: 2
Authors
Pichandi, Sivasankaran [1 ]
Balasubramanian, Gomathy [2 ]
Chakrapani, Venkatesh [3 ]
Affiliations
[1] Sengunthar Engn Coll, Elect & Commun Engn, Tiruchengode, Tamil Nadu, India
[2] PSG Inst Technol & Appl Res, Comp Sci & Engn, Coimbatore, Tamil Nadu, India
[3] Builders Engn Coll, Elect & Commun Engn, Kangeyam, Tamil Nadu, India
Source
SCIENTIFIC REPORTS | 2024, Vol. 14, No. 1
Keywords
Emotion analysis; Deep learning; AlexNet; DenseNet; Support vector machine; Principal component analysis; NEURAL-NETWORK; RECOGNITION; EEG; DATABASE; SIGNALS;
DOI
10.1038/s41598-024-75850-y
Chinese Library Classification (CLC)
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences];
Discipline classification codes
07 ; 0710 ; 09 ;
Abstract
Emotions play a vital role in recognizing a person's thoughts and vary significantly with stress levels. Emotion and stress classification have gained considerable attention in robotics and artificial intelligence applications. While many machine-learning-based methods deliver only average classification performance, recent deep learning approaches offer enhanced results. This research presents a hybrid deep learning model that extracts features using AlexNet and DenseNet models, followed by feature fusion and dimensionality reduction via Principal Component Analysis (PCA). The reduced features are then classified using a multi-class Support Vector Machine (SVM) to categorize different types of emotions. The proposed model was evaluated on the DEAP and EEG Brainwave datasets, both well suited to emotion analysis owing to their comprehensive EEG signal recordings and diverse emotional stimuli. The DEAP dataset contains EEG signals from 32 participants who each watched 40 one-minute music videos, while the EEG Brainwave dataset labels recordings as positive, negative, or neutral based on participants' responses to six different film clips. The proposed model achieved accuracies of 95.54% and 97.26% for the valence and arousal categories of the DEAP dataset, respectively, and 98.42% on the EEG Brainwave dataset. These results significantly outperform existing methods in terms of precision, recall, F1-score, specificity, and Matthews correlation coefficient. The integration of AlexNet and DenseNet, combined with PCA and a multi-class SVM, makes this approach particularly effective at capturing the intricate patterns in EEG data, highlighting its potential for applications in human-computer interaction and mental health monitoring and marking a clear advance over traditional methods.
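The pipeline the abstract describes (dual-CNN feature extraction, feature-level fusion, PCA reduction, multi-class classification) can be sketched in a few lines. This is a minimal, dependency-free illustration, not the authors' implementation: the feature vectors are random stand-ins sized like AlexNet fc7 (4096-d) and DenseNet-121 pooled (1024-d) outputs, the number of PCA components is an assumption, and the final SVM stage is replaced by a nearest-centroid classifier to keep the sketch self-contained.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 120                                               # synthetic EEG trials
alexnet_feats = rng.normal(size=(n, 4096))            # stand-in for AlexNet features
densenet_feats = rng.normal(size=(n, 1024))           # stand-in for DenseNet features
labels = rng.integers(0, 3, size=n)                   # e.g. positive/negative/neutral

# Feature-level fusion by concatenation
fused = np.hstack([alexnet_feats, densenet_feats])    # shape (120, 5120)

# PCA via SVD on the mean-centered fused features
X = fused - fused.mean(axis=0)
U, S, Vt = np.linalg.svd(X, full_matrices=False)
k = 50                                                # assumed component count
reduced = X @ Vt[:k].T                                # shape (120, 50)

# Nearest-centroid stand-in for the paper's multi-class SVM:
# assign each trial to the class whose mean reduced-feature vector is closest
centroids = np.stack([reduced[labels == c].mean(axis=0) for c in range(3)])
dists = np.linalg.norm(reduced[:, None, :] - centroids[None, :, :], axis=2)
preds = dists.argmin(axis=1)                          # shape (120,)
```

In practice the SVM stage would be a kernel classifier trained on held-out splits; the concatenate-then-reduce structure shown here is the part the abstract specifies.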
Pages: 17