A novel signal channel attention network for multi-modal emotion recognition

Cited by: 1
Authors
Du, Ziang [1]
Ye, Xia [1]
Zhao, Pujie [1]
Affiliations
[1] Xian Res Inst High Tech, Xian, Shaanxi, Peoples R China
Source
FRONTIERS IN NEUROROBOTICS, 2024, Vol. 18
Keywords
hypercomplex neural networks; physiological signals; attention fusion module; multi-modal fusion; emotion recognition;
DOI
10.3389/fnbot.2024.1442080
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Physiological signal recognition is crucial to emotion recognition, and recent advances in multi-modal fusion have enabled various physiological signals to be integrated for improved recognition performance. However, current models for emotion recognition from hypercomplex multi-modal signals are limited by their fusion methods and insufficient attention mechanisms, preventing further gains in classification performance. To address these challenges, we propose a new model framework named Signal Channel Attention Network (SCA-Net), which comprises three main components: an encoder, an attention fusion module, and a decoder. Within the attention fusion module, we developed five types of attention mechanisms inspired by existing research and performed comparative experiments on the public MAHNOB-HCI dataset. These experiments demonstrate that the attention fusion modules added to our baseline model improve both accuracy and F1 score. We also conducted ablation experiments on the most effective attention fusion module to verify the benefits of multi-modal fusion. Additionally, we adjusted the training process for each attention fusion module by using different early-stopping parameters to prevent overfitting.
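The abstract describes the SCA-Net pipeline only at a high level (encoder, attention fusion module, decoder over multi-modal physiological channels). The following is a minimal PyTorch-style sketch of that kind of pipeline for illustration; the layer choices, module names, tensor shapes, and the squeeze-and-excitation-style channel attention are assumptions, not the authors' implementation.

```python
# Hypothetical sketch of an encoder -> channel-attention fusion -> decoder
# pipeline in the spirit of the SCA-Net description; all sizes and the exact
# attention form are assumptions for illustration only.
import torch
import torch.nn as nn


class ChannelAttentionFusion(nn.Module):
    """Re-weights feature channels (one squeeze-and-excitation-style gate)."""

    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        self.gate = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, time) -- encoded features from all modalities
        weights = self.gate(x.mean(dim=-1))   # squeeze over the time axis
        return x * weights.unsqueeze(-1)      # scale each channel by its weight


class SCANetSketch(nn.Module):
    def __init__(self, in_channels: int, hidden: int = 64, num_classes: int = 4):
        super().__init__()
        self.encoder = nn.Conv1d(in_channels, hidden, kernel_size=7, padding=3)
        self.fusion = ChannelAttentionFusion(hidden)
        self.decoder = nn.Sequential(
            nn.AdaptiveAvgPool1d(1), nn.Flatten(), nn.Linear(hidden, num_classes)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.decoder(self.fusion(self.encoder(x)))


# Usage: a batch of 8 samples, 6 signal channels (e.g. EEG/ECG/EDA), 512 time steps.
logits = SCANetSketch(in_channels=6)(torch.randn(8, 6, 512))
```

Swapping the gate for other attention variants (and tuning the early-stopping patience per variant, as the abstract mentions) would be the natural way to reproduce the paper's comparative setup.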
Pages: 11
Related Papers
50 records in total
  • [21] Multi-modal Emotion Recognition for Determining Employee Satisfaction
    Zaman, Farhan Uz
    Zaman, Maisha Tasnia
    Alam, Md Ashraful
    Alam, Md Golam Rabiul
    2021 IEEE ASIA-PACIFIC CONFERENCE ON COMPUTER SCIENCE AND DATA ENGINEERING (CSDE), 2021,
  • [22] Emotion recognition based on multi-modal physiological signals and transfer learning
    Fu, Zhongzheng
    Zhang, Boning
    He, Xinrun
    Li, Yixuan
    Wang, Haoyuan
    Huang, Jian
    FRONTIERS IN NEUROSCIENCE, 2022, 16
  • [23] A Deep GRU-BiLSTM Network for Multi-modal Emotion Recognition from Text
    Yacoubi, Ibtissem
    Ferjaoui, Radhia
    Ben Khalifa, Anouar
    2024 IEEE 7TH INTERNATIONAL CONFERENCE ON ADVANCED TECHNOLOGIES, SIGNAL AND IMAGE PROCESSING, ATSIP 2024, 2024, : 138 - 143
  • [24] SMIN: Semi-Supervised Multi-Modal Interaction Network for Conversational Emotion Recognition
    Lian, Zheng
    Liu, Bin
    Tao, Jianhua
    IEEE TRANSACTIONS ON AFFECTIVE COMPUTING, 2023, 14 (03) : 2415 - 2429
  • [25] Multi-modal Conditional Attention Fusion for Dimensional Emotion Prediction
    Chen, Shizhe
    Jin, Qin
    MM'16: PROCEEDINGS OF THE 2016 ACM MULTIMEDIA CONFERENCE, 2016, : 571 - 575
  • [26] A joint hierarchical cross-attention graph convolutional network for multi-modal facial expression recognition
    Xu, Chujie
    Du, Yong
    Wang, Jingzi
    Zheng, Wenjie
    Li, Tiejun
    Yuan, Zhansheng
    COMPUTATIONAL INTELLIGENCE, 2024, 40 (01)
  • [27] MIA-Net: Multi-Modal Interactive Attention Network for Multi-Modal Affective Analysis
    Li, Shuzhen
    Zhang, Tong
    Chen, Bianna
    Chen, C. L. Philip
    IEEE TRANSACTIONS ON AFFECTIVE COMPUTING, 2023, 14 (04) : 2796 - 2809
  • [28] Contextual and Cross-Modal Interaction for Multi-Modal Speech Emotion Recognition
    Yang, Dingkang
    Huang, Shuai
    Liu, Yang
    Zhang, Lihua
    IEEE SIGNAL PROCESSING LETTERS, 2022, 29 : 2093 - 2097
  • [29] A multi-modal deep learning system for Arabic emotion recognition
    Abu Shaqra F.
    Duwairi R.
    Al-Ayyoub M.
    International Journal of Speech Technology, 2023, 26 (01) : 123 - 139
  • [30] M3GAT: A Multi-modal, Multi-task Interactive Graph Attention Network for Conversational Sentiment Analysis and Emotion Recognition
    Zhang, Yazhou
    Jia, Ao
    Wang, Bo
    Zhang, Peng
    Zhao, Dongming
    Li, Pu
    Hou, Yuexian
    Jin, Xiaojia
    Song, Dawei
    Qin, Jing
    ACM TRANSACTIONS ON INFORMATION SYSTEMS, 2024, 42 (01)