Switch fusion for continuous emotion estimation from multiple physiological signals

Times Cited: 0
Authors
Vu, Ngoc Tu [1 ]
Huynh, Van Thong [3 ]
Kim, Seung-Won [1 ]
Shin, Ji-eun [2 ]
Yang, Hyung-Jeong [1 ]
Kim, Soo-Hyung [1 ]
Affiliations
[1] Chonnam Natl Univ, Dept AI Convergence, Gwangju 61186, South Korea
[2] Chonnam Natl Univ, Dept Psychol, Gwangju 61186, South Korea
[3] FPT Univ, Dept ITS, HoChiMinh City 71216, Vietnam
Funding
National Research Foundation, Singapore;
Keywords
Continuous emotion estimation; Multimodal dynamic fusion; Physiological signals; Affective computing; Facial expression; Recognition
DOI
10.1016/j.bspc.2025.107831
Chinese Library Classification (CLC)
R318 [Biomedical Engineering];
Discipline Code
0831;
Abstract
Physiological signals provide a robust foundation for affective computing, primarily because they resist conscious manipulation by subjects. With the proliferation of applications such as safe driving, mental health treatment, and wearable wellness technologies, emotion recognition based on physiological signals has garnered substantial attention. However, the growing variety of signals captured by diverse sensors makes it challenging for models to integrate these inputs and predict emotional states accurately and efficiently, and determining an optimal fusion strategy becomes increasingly complex as the number of signals grows. To address this, we propose switch fusion, a dynamic allocation fusion algorithm that enables models to learn optimal fusion strategies across multiple modalities. Leveraging the mixture-of-experts framework, our approach employs a gating mechanism to route modalities to specialized experts, which serve as fusion encoder modules. Furthermore, we demonstrate the effectiveness of time-series models in processing physiological signals for continuous emotion estimation while enhancing computational efficiency. Experiments on the Continuously Annotated Signals of Emotion (CASE) dataset highlight the effectiveness of switch fusion, achieving root mean square errors of 1.064 and 1.089 for arousal and valence scores, respectively, and surpassing state-of-the-art methods in 3 out of 4 experimental scenarios. This study underscores the critical role of dynamic fusion strategies in continuous emotion estimation from diverse physiological signals, effectively addressing the challenges posed by increasingly complex sensor inputs.
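The abstract describes a gating mechanism that routes each modality to a specialized expert acting as a fusion encoder. As a rough illustration only, and not the authors' implementation, the PyTorch sketch below shows one way top-1 (switch-style) routing over per-modality feature vectors could be wired up; the names SwitchFusion, d_model, and n_experts are hypothetical.

# Minimal sketch of switch-style (top-1) routing for multimodal fusion.
# Assumes each physiological modality has been encoded to a d_model-dim vector.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SwitchFusion(nn.Module):
    def __init__(self, d_model: int, n_experts: int):
        super().__init__()
        # Gating network: scores each modality feature against the experts.
        self.gate = nn.Linear(d_model, n_experts)
        # Each expert is a small fusion encoder (here a feed-forward block).
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_model), nn.GELU(),
                          nn.Linear(d_model, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_modalities, d_model), one feature vector per signal.
        probs = F.softmax(self.gate(x), dim=-1)      # (B, M, E)
        top_p, top_idx = probs.max(dim=-1)           # top-1 (switch) routing
        out = torch.zeros_like(x)
        for e, expert in enumerate(self.experts):
            mask = top_idx == e                      # modalities sent to expert e
            if mask.any():
                out[mask] = expert(x[mask])
        # Scale by the gate probability so the router stays differentiable,
        # then pool over modalities to obtain one fused representation.
        return (out * top_p.unsqueeze(-1)).mean(dim=1)   # (B, d_model)

# Usage: 8 physiological modalities, 64-dim features, 4 expert encoders.
fusion = SwitchFusion(d_model=64, n_experts=4)
feats = torch.randn(2, 8, 64)                        # (batch, modalities, dim)
print(fusion(feats).shape)                           # torch.Size([2, 64])

Top-1 routing keeps the per-modality compute roughly constant as experts are added, which is the usual motivation for switch-style gating over dense mixtures.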
Pages: 13