Emotion recognition based on multiple physiological signals

Cited by: 25
Authors
Li, Qi [1 ]
Liu, Yunqing [1 ]
Yan, Fei [1 ]
Zhang, Qiong [1 ]
Liu, Cong [2 ]
Affiliations
[1] Changchun Univ Sci & Technol, Dept Elect & Informat Engn, Changchun, Peoples R China
[2] PLA Army Acad Artillery & Air Def, Zhengzhou, Peoples R China
Keywords
EEG; Peripheral physiological signals; Multimodal; Emotion recognition; Feature extraction; Differential entropy feature
DOI
10.1016/j.bspc.2023.104989
Chinese Library Classification (CLC)
R318 [Biomedical Engineering]
Discipline code
0831
Abstract
Physiological signals reflect human emotional states more faithfully than overt behavior. To overcome the limitations of single-modal emotion recognition, emotion recognition from multimodal physiological signals has received increasing attention. However, existing fusion models usually ignore the differing distributions of the individual signals and fail to capture complementary features from the multimodal information effectively. This paper proposes an effective classification model for multimodal physiological signals that addresses these issues by modeling the heterogeneity of, and correlations between, the modalities. First, differential entropy features are extracted from electroencephalography (EEG) signals and from peripheral physiological signals (PPS) such as electrocardiographic (ECG) and electromyographic (EMG) signals. Then, according to the distributions and frequency characteristics of the acquired signals, the EEG features are arranged into a three-dimensional feature map and fed into a neural network to extract frequency-spatial features, while temporal features are extracted from the peripheral physiological signals using a long short-term memory (LSTM) network. Finally, the EEG and peripheral-signal features are fused and fed into a multimodal LSTM network that captures the associations between modalities and performs classification. Experiments on the benchmark DEAP dataset show that the proposed model achieves classification accuracies of 95.89% in the arousal dimension and 94.99% in the valence dimension, which are 2.77% and 3.11% higher, respectively, than those of the unimodal EEG model. The paper also analyzes the effects of different peripheral physiological signals on emotion recognition.
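As a rough illustration of the pipeline the abstract describes, the sketch below is a minimal, assumed NumPy/PyTorch implementation, not the authors' code: it computes band-wise differential entropy (0.5*ln(2*pi*e*variance) of a band-filtered segment, the standard Gaussian-based formula from the EEG emotion-recognition literature) and wires a small CNN branch for 3-D EEG feature maps, an LSTM branch for peripheral signals, and a fusion LSTM classifier. The frequency bands, the 9x9 electrode-map size, the peripheral-feature dimension, and all layer sizes are illustrative assumptions.

```python
# Minimal sketch of the described pipeline; shapes and hyperparameters are assumptions.
import numpy as np
import torch
import torch.nn as nn
from scipy.signal import butter, filtfilt

BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}  # assumed bands

def differential_entropy(segment: np.ndarray, fs: float = 128.0) -> np.ndarray:
    """Band-wise differential entropy per channel: 0.5 * ln(2*pi*e*var)."""
    feats = []
    for lo, hi in BANDS.values():
        b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
        filtered = filtfilt(b, a, segment, axis=-1)            # segment: (channels, samples)
        feats.append(0.5 * np.log(2 * np.pi * np.e * filtered.var(axis=-1)))
    return np.stack(feats, axis=0)                              # (bands, channels)

class FusionNet(nn.Module):
    """CNN over 3-D EEG feature maps + LSTM over peripheral features + fusion LSTM."""
    def __init__(self, n_bands=4, pps_dim=8, hidden=64, n_classes=2):
        super().__init__()
        self.cnn = nn.Sequential(                               # frequency-spatial branch
            nn.Conv2d(n_bands, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten())               # -> (batch, 64)
        self.pps_lstm = nn.LSTM(pps_dim, hidden, batch_first=True)   # temporal branch
        self.fusion_lstm = nn.LSTM(64 + hidden, hidden, batch_first=True)
        self.classifier = nn.Linear(hidden, n_classes)

    def forward(self, eeg_maps, pps_seq):
        # eeg_maps: (batch, n_bands, H, W); pps_seq: (batch, time, pps_dim)
        eeg_feat = self.cnn(eeg_maps)                            # (batch, 64)
        _, (pps_h, _) = self.pps_lstm(pps_seq)                   # last hidden: (batch, hidden)
        fused = torch.cat([eeg_feat, pps_h[-1]], dim=-1).unsqueeze(1)
        _, (h, _) = self.fusion_lstm(fused)                      # joint multimodal state
        return self.classifier(h[-1])                            # logits per class
```

For example, FusionNet()(torch.randn(4, 4, 9, 9), torch.randn(4, 60, 8)) returns one logit pair per sample for a binary arousal or valence label; the actual window length, electrode-map layout, and number of peripheral channels depend on the DEAP preprocessing used.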
Pages: 10