Emotion recognition based on multi-modal physiological signals and transfer learning

Cited by: 20
Authors
Fu, Zhongzheng [1 ]
Zhang, Boning [1 ]
He, Xinrun [1 ]
Li, Yixuan [1 ]
Wang, Haoyuan [1 ]
Huang, Jian [1 ]
Affiliations
[1] Huazhong Univ Sci & Technol, Sch Artificial Intelligence & Automation, Wuhan, Peoples R China
Keywords
emotion recognition; transfer learning; domain adaptation; physiological signal; multimodal fusion; individual difference;
DOI
10.3389/fnins.2022.1000716
CLC number
Q189 [Neuroscience];
Subject classification code
071006 ;
Abstract
In emotion recognition based on physiological signals, collecting enough labeled data from a single subject for training is time-consuming and expensive. Individual differences in physiological signals and their inherent noise significantly degrade emotion recognition accuracy. To overcome inter-subject differences in physiological signals, we propose a joint probability domain adaptation with bi-projection matrix algorithm (JPDA-BPM). The bi-projection matrix method accounts for the different feature distributions of the source and target domains: by projecting each domain into the shared feature space with its own matrix, it improves the algorithm's performance. To overcome the effect of noise in physiological signals, we propose a substructure-based joint probability domain adaptation algorithm (SSJPDA). This method avoids the drawbacks of domain-level matching, which is too coarse, and of sample-level matching, which is susceptible to noise. To verify the effectiveness of the proposed transfer learning algorithm for emotion recognition based on physiological signals, we evaluated it on the database for emotion analysis using physiological signals (DEAP dataset). The experimental results show that the proposed SSJPDA-BPM algorithm achieves average recognition accuracies of 63.6% for valence and 64.4% for arousal on multimodal fused physiological data from the DEAP dataset. Compared with joint probability domain adaptation (JPDA), valence and arousal recognition accuracies increased by 17.6% and 13.4%, respectively.
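The core idea behind the abstract's bi-projection matrix and joint-probability matching can be illustrated with a minimal sketch: project source and target features with separate matrices, then measure the class-conditional (joint-probability) discrepancy between the projected domains. This is an illustrative toy, not the authors' JPDA-BPM implementation; the function names, the linear-kernel mean discrepancy, and the use of copied pseudo-labels are all simplifying assumptions.

```python
import numpy as np

def class_mmd(Zs, ys, Zt, yt_pseudo):
    """Sum over classes of squared mean discrepancies between projected
    source and target features (a linear-kernel, class-conditional MMD --
    a stand-in for joint-probability distribution matching)."""
    loss = 0.0
    for c in np.unique(ys):
        mu_s = Zs[ys == c].mean(axis=0)          # source class-c centroid
        mu_t = Zt[yt_pseudo == c].mean(axis=0)   # target class-c centroid
        loss += float(np.sum((mu_s - mu_t) ** 2))
    return loss

rng = np.random.default_rng(0)
Xs = rng.normal(size=(100, 8))                   # source-subject features
ys = rng.integers(0, 2, size=100)                # binary labels (e.g. valence)
Xt = Xs + 0.5                                    # target domain with a covariate shift
yt_pseudo = ys.copy()                            # pseudo-labels, assumed given here

# Bi-projection idea: each domain gets its own projection matrix,
# rather than forcing one shared projection onto both distributions.
A_s = rng.normal(size=(8, 4))
A_t = rng.normal(size=(8, 4))
d = class_mmd(Xs @ A_s, ys, Xt @ A_t, yt_pseudo)
print(d >= 0.0)
```

In the paper's actual algorithm the two projections are optimized to minimize such a discrepancy (and SSJPDA matches learned substructures rather than whole domains or raw samples); here the matrices are random purely to show the quantity being minimized.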
Pages: 15