Hierarchical extreme puzzle learning machine-based emotion recognition using multimodal physiological signals

Cited by: 16
Authors
Pradhan, Anushka [1 ]
Srivastava, Subodh [2 ]
Affiliations
[1] Natl Inst Technol Patna, Dept ECE, Patna 800005, Bihar, India
[2] Natl Inst Technol Patna, Dept ECE, Patna 800005, Bihar, India
Keywords
Physiological signals; Emotion recognition; Filtering; Signal conversion; Ensemble; Optimal features; NETWORK;
DOI
10.1016/j.bspc.2023.104624
CLC classification
R318 [Biomedical Engineering];
Subject classification
0831 ;
Abstract
Detection of exact emotions from multi-modal physiological signals provides relevant information for many applications. Numerous computational approaches have been proposed for the precise analysis of emotion types, but problems such as degraded signal quality, high time consumption, and large storage requirements reduce classification accuracy. Hence, this research classifies multi-modal physiological signals using machine learning and deep learning (DL) models. The proposed work implements the Hierarchical Extreme Puzzle Learning Machine (HEPLM) approach to classify the actual output of embedded emotions. It comprises four steps: pre-processing, signal-to-image conversion, feature extraction, and classification. Pre-processing is carried out with Savitzky-Golay smoothing filtering (SGF) to remove noise and increase signal quality. A hybrid wavelet scattering and Synchrosqueezing Wavelet Transform approach converts the signal into an image. In the feature extraction process, valuable features are extracted using ResNet-152 and the Inception v3 model, and the features are combined through an ensemble approach. HEPLM, which combines the Puzzle Optimization Algorithm (POA) and the Hierarchical Extreme Learning Machine (HELM), performs the final classification, reducing feature dimensionality and improving classification accuracy. The dataset adopted in the proposed work is Wearable Stress and Affect Detection (WESAD), which provides multi-modal physiological signals. Performance is assessed with metrics such as accuracy, recall, precision, F1 score, and kappa. The proposed approach demonstrates better emotion classification results than existing methods, achieving 96.29% accuracy.
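The pre-processing step described above uses Savitzky-Golay smoothing to denoise the physiological signals. A minimal sketch of that step with `scipy.signal.savgol_filter` is shown below; the signal, noise level, and window parameters are illustrative assumptions, not values from the paper.

```python
import numpy as np
from scipy.signal import savgol_filter

# Hypothetical noisy physiological trace: a sinusoid with additive Gaussian
# noise stands in for a raw sensor channel (illustrative, not WESAD data).
rng = np.random.default_rng(0)
t = np.linspace(0, 4, 400)
clean = np.sin(2 * np.pi * t)
noisy = clean + rng.normal(scale=0.3, size=t.shape)

# Savitzky-Golay smoothing fits a low-order polynomial in a sliding window;
# window_length must be odd and greater than polyorder.
smoothed = savgol_filter(noisy, window_length=31, polyorder=3)

# Smoothing should bring the trace closer to the underlying clean signal.
err_noisy = np.mean((noisy - clean) ** 2)
err_smooth = np.mean((smoothed - clean) ** 2)
print(f"MSE noisy: {err_noisy:.4f}, MSE smoothed: {err_smooth:.4f}")
```

Unlike a plain moving average, the polynomial fit preserves peak shapes, which matters for waveform features in physiological signals.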
Pages: 15