Investigating the brain network characteristics of multimodal emotion recognition and its classification applications based on functional connectivity patterns

Cited: 1
Authors
Gu, Jin [1 ,2 ]
Luo, Xiaoqi [1 ]
Gong, Xinhao [1 ]
Su, Chenxu [1 ]
Affiliations
[1] Southwest Jiaotong Univ, Sch Comp & Artificial Intelligence, Chengdu, Peoples R China
[2] Mfg Ind Chains Collaborat, Informat Support Technol Key Lab Sichuan Prov, Chengdu, Peoples R China
Funding
China Postdoctoral Science Foundation; National Natural Science Foundation of China;
Keywords
Multimodal emotion recognition; Emotion valence; Functional connectivity; Brain signal classification; fMRI; REPRESENTATIONS; SPEECH;
DOI
10.1016/j.bspc.2024.106635
CLC number
R318 [Biomedical Engineering];
Discipline code
0831;
Abstract
Emotion information can be expressed through stimuli in multiple modalities, and the brain recognizes multi-modal emotions efficiently and accurately. Recent studies based on functional magnetic resonance imaging (fMRI) have thoroughly analyzed the activity characteristics of relevant brain regions under emotion information from different modalities. However, given the brain's functional integration during cognitive activities, further research on the brain network underlying multi-modal emotion recognition is needed, as it can reveal the brain's multi-modal emotion cognition mechanism more comprehensively. In this study, functional connectivity (FC) analysis was performed on fMRI data from multimodal emotion recognition tasks. Correlation coefficients between brain regions were calculated and statistically analyzed to characterize the FC patterns involved in multimodal emotion recognition processing. Moreover, emotional information was decoded with different machine learning classification algorithms based on the FC patterns. The results showed that the modality and valence of emotion lead to structural similarities and differences in connection strength among brain-region connections in the brain network; these properties successfully support the decoding of emotional information, with decoding accuracy higher than that previously reported for decoding based on brain-region activation patterns. This study explores the cognitive mechanism of multi-modal emotion recognition from the new perspective of brain functional integration, and addresses the lack of brain-signal decoding methods based on FC features and machine learning.
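A minimal sketch of the FC-pattern decoding idea described in the abstract, assuming Pearson correlation as the FC measure. The region count, trial structure, synthetic time series, and nearest-centroid classifier are illustrative assumptions only; the study itself uses real fMRI data and several machine-learning classifiers:

```python
# Sketch: functional-connectivity (FC) features from ROI time series,
# decoded with a simple nearest-centroid rule. All data here are synthetic.
import numpy as np

rng = np.random.default_rng(0)
n_regions, n_timepoints = 10, 120

def fc_features(ts):
    """Pearson correlation matrix of ROI time series -> upper-triangle vector."""
    corr = np.corrcoef(ts)                 # (n_regions, n_regions) FC matrix
    iu = np.triu_indices(n_regions, k=1)   # exclude the diagonal
    return corr[iu]                        # one FC pattern per trial

def make_trial(shift):
    """Synthetic trial: a shared signal scaled by `shift` yields stronger FC."""
    base = rng.standard_normal((1, n_timepoints))
    noise = rng.standard_normal((n_regions, n_timepoints))
    return shift * base + noise

# Two synthetic "conditions" whose FC patterns differ in connection strength.
X = np.array([fc_features(make_trial(s)) for s in [0.2] * 20 + [1.5] * 20])
y = np.array([0] * 20 + [1] * 20)

# Nearest-centroid decoding; hold out the last 5 trials of each condition.
train, test = np.r_[0:15, 20:35], np.r_[15:20, 35:40]
centroids = np.stack([X[train][y[train] == c].mean(axis=0) for c in (0, 1)])
dists = np.linalg.norm(X[test][:, None] - centroids[None], axis=2)
pred = np.argmin(dists, axis=1)
acc = (pred == y[test]).mean()
print(f"decoding accuracy on held-out trials: {acc:.2f}")
```

The upper-triangle vectorization is the standard way to turn a symmetric FC matrix into a feature vector; any classifier (SVM, logistic regression, etc.) could replace the nearest-centroid rule used here for brevity.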
Pages: 14
Related papers
33 entries in total
[21]   Single-cell activity and network properties of dorsal raphe nucleus serotonin neurons during emotionally salient behaviors [J].
Paquelet, Grace E. ;
Carrion, Kassandra ;
Lacefield, Clay O. ;
Zhou, Pengcheng ;
Hen, Rene ;
Miller, Bradley R. .
NEURON, 2022, 110 (16) :2664-+
[22]   Supramodal Representations of Perceived Emotions in the Human Brain [J].
Peelen, Marius V. ;
Atkinson, Anthony P. ;
Vuilleumier, Patrik .
JOURNAL OF NEUROSCIENCE, 2010, 30 (30) :10127-10134
[23]   Discrete Neural Signatures of Basic Emotions [J].
Saarimäki, Heini ;
Gotsopoulos, Athanasios ;
Jääskeläinen, Iiro P. ;
Lampinen, Jouko ;
Vuilleumier, Patrik ;
Hari, Riitta ;
Sams, Mikko ;
Nummenmaa, Lauri .
CEREBRAL CORTEX, 2016, 26 (06) :2563-2573
[24]   MMTrans-MT: A Framework for Multimodal Emotion Recognition Using Multitask Learning [J].
Shen, Jinrui ;
Zheng, Jiahao ;
Wang, Xiaoping .
2021 13TH INTERNATIONAL CONFERENCE ON ADVANCED COMPUTATIONAL INTELLIGENCE (ICACI), 2021, :52-59
[25]   Emotion Recognition under Sleep Deprivation Using a Multimodal Residual LSTM Network [J].
Tao, Le-Yan ;
Lu, Bao-Liang .
2020 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2020,
[26]   Investigating EEG-based functional connectivity patterns for multimodal emotion recognition [J].
Wu, Xun ;
Zheng, Wei-Long ;
Li, Ziyi ;
Lu, Bao-Liang .
JOURNAL OF NEURAL ENGINEERING, 2022, 19 (01)
[27]   Multi-Hypergraph Learning-Based Brain Functional Connectivity Analysis in fMRI Data [J].
Xiao, Li ;
Wang, Junqi ;
Kassani, Peyman H. ;
Zhang, Yipu ;
Bai, Yuntong ;
Stephen, Julia M. ;
Wilson, Tony W. ;
Calhoun, Vince D. ;
Wang, Yu-Ping .
IEEE TRANSACTIONS ON MEDICAL IMAGING, 2020, 39 (05) :1746-1758
[28]   Robust Multimodal Emotion Recognition from Conversation with Transformer-Based Crossmodality Fusion [J].
Xie, Baijun ;
Sidulova, Mariia ;
Park, Chung Hyuk .
SENSORS, 2021, 21 (14)
[29]   Weighted RSA: An Improved Framework on the Perception of Audio-visual Affective Speech in Left Insula and Superior Temporal Gyrus [J].
Xu, Junhai ;
Dong, Haibin ;
Li, Na ;
Wang, Zeyu ;
Guo, Fei ;
Wei, Jianguo ;
Dang, Jianwu .
NEUROSCIENCE, 2021, 469 :46-58
[30]   Reduced default mode network functional connectivity in patients with recurrent major depressive disorder [J].
Yan, Chao-Gan ;
Chen, Xiao ;
Li, Le ;
Castellanos, Francisco Xavier ;
Bai, Tong-Jian ;
Bo, Qi-Jing ;
Cao, Jun ;
Chen, Guan-Mao ;
Chen, Ning-Xuan ;
Chen, Wei ;
Cheng, Chang ;
Cheng, Yu-Qi ;
Cui, Xi-Long ;
Duan, Jia ;
Fang, Yi-Ru ;
Gong, Qi-Yong ;
Guo, Wen-Bin ;
Hou, Zheng-Hua ;
Hu, Lan ;
Kuang, Li ;
Li, Feng ;
Li, Kai-Ming ;
Li, Tao ;
Liu, Yan-Song ;
Liu, Zhe-Ning ;
Long, Yi-Cheng ;
Luo, Qing-Hua ;
Meng, Hua-Qing ;
Peng, Dai-Hui ;
Qiu, Hai-Tang ;
Qiu, Jiang ;
Shen, Yue-Di ;
Shi, Yu-Shu ;
Wang, Chuan-Yue ;
Wang, Fei ;
Wang, Kai ;
Wang, Li ;
Wang, Xiang ;
Wang, Ying ;
Wu, Xiao-Ping ;
Wu, Xin-Ran ;
Xie, Chun-Ming ;
Xie, Guang-Rong ;
Xie, Hai-Yan ;
Xie, Peng ;
Xu, Xiu-Feng ;
Yang, Hong ;
Yang, Jian ;
Yao, Jia-Shu ;
Yao, Shu-Qiao .
PROCEEDINGS OF THE NATIONAL ACADEMY OF SCIENCES OF THE UNITED STATES OF AMERICA, 2019, 116 (18) :9078-9083