EEG emotion recognition approach using multi-scale convolution and feature fusion

Times Cited: 0
Authors
Zhang, Yong [1 ]
Shan, Qingguo [1 ]
Chen, Wenyun [1 ]
Liu, Wenzhe [1 ]
Affiliations
[1] Huzhou Univ, Sch Informat Engn, East 2nd Rd, Huzhou 313000, Zhejiang, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Attention mechanism; Electroencephalogram; Emotion recognition; Long short-term memory network; Multi-scale convolutional neural network; Neural networks
DOI
10.1007/s00371-024-03652-4
CLC Number
TP31 [Computer Software]
Discipline Codes
081202; 0835
Abstract
Electroencephalogram (EEG) signals have been widely applied in emotion recognition because of their objectivity and their reflection of an individual's actual emotional state. However, current EEG-based emotion recognition methods still suffer from limitations such as single-feature extraction, missing local features, and low feature extraction efficiency, all of which affect recognition accuracy. To address these challenges, we propose an EEG emotion recognition approach based on multi-scale convolution and feature fusion, which integrates low-dimensional and high-dimensional EEG features to enhance classification performance. The proposed approach first extracts multi-scale EEG features with a multi-scale convolutional neural network. It then applies the convolutional block attention module (CBAM) to refine the multi-scale features, highlighting emotion-related high-quality features while suppressing unimportant ones. Finally, a long short-term memory (LSTM) network extracts emotion-related temporal sequences to achieve time-space-frequency feature fusion of the EEG. We conducted fivefold cross-validation on the DEAP and SEED datasets to demonstrate the effectiveness of the proposed approach. On the DEAP dataset, the approach achieved average accuracies of 97.48% and 97.75% for the valence and arousal dimensions, respectively, and 96.98% on the SEED dataset, outperforming the compared models in recognition accuracy.
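The abstract describes a three-stage pipeline: parallel multi-scale convolutions, CBAM-style attention refinement, and an LSTM over the refined feature sequence. The following PyTorch sketch shows one plausible reading of that pipeline; the layer widths, kernel sizes, the 1D simplification of CBAM, and the DEAP-style input shape (32 electrode channels, 128 samples) are illustrative assumptions, not the authors' published configuration.

# Minimal, illustrative sketch of the pipeline described in the abstract:
# multi-scale 1D convolutions -> simplified CBAM attention -> LSTM classifier.
import torch
import torch.nn as nn

class CBAM1d(nn.Module):
    """Simplified CBAM: channel attention followed by temporal attention."""
    def __init__(self, channels, reduction=4):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(),
            nn.Linear(channels // reduction, channels),
        )
        self.temporal = nn.Conv1d(2, 1, kernel_size=7, padding=3)

    def forward(self, x):                      # x: (batch, channels, time)
        avg = self.mlp(x.mean(dim=2))          # channel attention from average pooling
        mx = self.mlp(x.amax(dim=2))           # ... and from max pooling
        x = x * torch.sigmoid(avg + mx).unsqueeze(2)
        t = torch.cat([x.mean(dim=1, keepdim=True),
                       x.amax(dim=1, keepdim=True)], dim=1)
        return x * torch.sigmoid(self.temporal(t))  # temporal (spatial-style) attention

class MultiScaleCNNLSTM(nn.Module):
    def __init__(self, eeg_channels=32, num_classes=2):
        super().__init__()
        # Parallel convolutions with different kernel sizes capture features
        # at multiple temporal scales; the sizes here are assumptions.
        self.branches = nn.ModuleList([
            nn.Conv1d(eeg_channels, 16, k, padding=k // 2) for k in (3, 5, 7)
        ])
        self.cbam = CBAM1d(48)
        self.lstm = nn.LSTM(48, 64, batch_first=True)
        self.fc = nn.Linear(64, num_classes)

    def forward(self, x):                      # x: (batch, eeg_channels, time)
        feats = torch.cat([torch.relu(b(x)) for b in self.branches], dim=1)
        feats = self.cbam(feats)               # refine the multi-scale feature maps
        out, _ = self.lstm(feats.permute(0, 2, 1))  # sequence over time steps
        return self.fc(out[:, -1])             # classify from the last hidden state

# Example: a batch of 8 one-second DEAP-style segments (32 channels at 128 Hz).
model = MultiScaleCNNLSTM()
print(model(torch.randn(8, 32, 128)).shape)    # torch.Size([8, 2])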
Pages: 4157-4169
Number of pages: 13
Related Papers
50 records in total
  • [41] Multi-analysis domain feature fusion of EEG emotion recognition based on integrated deep learning model
    Chao H.
    Liu Y.-L.
    Lian W.-F.
    Kongzhi yu Juece/Control and Decision, 2020, 35 (07) : 1674 - 1680
  • [42] Unconscious Emotion Recognition based on Multi-scale Sample Entropy
    Shi, Yanjing
    Zheng, Xiangwei
    Li, Tiantian
    PROCEEDINGS 2018 IEEE INTERNATIONAL CONFERENCE ON BIOINFORMATICS AND BIOMEDICINE (BIBM), 2018 : 1221 - 1226
  • [43] Multi-view Self-supervised Learning and Multi-scale Feature Fusion for Automatic Speech Recognition
    Zhao, Jingyu
    Li, Ruwei
    Tian, Maocun
    An, Weidong
    NEURAL PROCESSING LETTERS, 2024, 56 (04)
  • [44] EEG-Based Multi-Modal Emotion Recognition using Bag of Deep Features: An Optimal Feature Selection Approach
    Asghar, Muhammad Adeel
    Khan, Muhammad Jamil
    Fawad
    Amin, Yasar
    Rizwan, Muhammad
    Rahman, MuhibUr
    Badnava, Salman
    Mirjavadi, Seyed Sajad
    SENSORS, 2019, 19 (23)
  • [45] CATM: A Multi-Feature-Based Cross-Scale Attentional Convolutional EEG Emotion Recognition Model
    Yu, Hongde
    Xiong, Xin
    Zhou, Jianhua
    Qian, Ren
    Sha, Kaiwen
    SENSORS, 2024, 24 (15)
  • [46] MFANet: Multi-scale feature fusion network with attention mechanism
    Gaihua Wang
    Xin Gan
    Qingcheng Cao
    Qianyu Zhai
    The Visual Computer, 2023, 39 : 2969 - 2980
  • [47] MFANet: Multi-scale feature fusion network with attention mechanism
    Wang, Gaihua
    Gan, Xin
    Cao, Qingcheng
    Zhai, Qianyu
    VISUAL COMPUTER, 2023, 39 (07) : 2969 - 2980
  • [48] Multi-Scale Feature Interactive Fusion Network for RGBT Tracking
    Xiao, Xianbing
    Xiong, Xingzhong
    Meng, Fanqin
    Chen, Zhen
    SENSORS, 2023, 23 (07)
  • [49] An autoencoder-based feature level fusion for speech emotion recognition
    Peng, Shixin
    Kai, Chen
    Tian, Tian
    Chen, Jingying
    DIGITAL COMMUNICATIONS AND NETWORKS, 2024, 10 (05) : 1341 - 1351
  • [50] An EEG Data Processing Approach for Emotion Recognition
    Li, Guofa
    Ouyang, Delin
    Yuan, Yufei
    Li, Wenbo
    Guo, Zizheng
    Qu, Xingda
    Green, Paul
    IEEE SENSORS JOURNAL, 2022, 22 (11) : 10751 - 10763