A two-stage multimodal emotion analysis using body actions and facial features

Times Cited: 0
Authors
Tseng, Hsiao-Ting [1 ]
Hsieh, Chen-Chiung [2 ]
Xu, Cheng-Hong [2 ]
Affiliations
[1] Natl Cent Univ, Dept Informat Management, 300 Zhongda Rd, Taoyuan City 320, Taiwan
[2] Tatung Univ, Dept Comp Sci & Engn, 40, Sect 3, Jhongshan N Rd, Taipei City 104, Taiwan
Keywords
Deep learning; Emotion analysis; Action recognition; LSTM; Action transformer; Recognition; Posture
DOI
10.1007/s11760-025-03891-5
Chinese Library Classification
TM [Electrical Engineering]; TN [Electronic Technology, Communication Technology]
Discipline Classification Code
0808; 0809
Abstract
This study analyzes emotions on various positive and negative scales by combining body action detection with facial expression identification. The body action dataset includes annotated actions from several well-known databases together with a self-created series of coordinate movements. In the first stage, neural network models such as CNN + LSTM, LSTM, and the Action Transformer are trained on these datasets, with various attention mechanisms applied to confirm their efficacy. Facial expression identification is enhanced using open-source datasets and the DeepFace toolkit developed by the Meta AI Research Group. In the second stage, emotion analysis is conducted by combining the outcomes of body action recognition and facial expression identification, which are then fed into a kNN, SVM, or decision tree classifier for positive/negative classification. The experimental outcomes highlight several crucial points: the accuracy rates for action recognition using RNN + LSTM and the Action Transformer are 81.1% and 88.4%, respectively, and DeepFace attains a facial expression identification accuracy of 97%. When the two modalities are consolidated for emotion analysis with kNN, the test accuracy improves significantly, demonstrating strong performance across various scenarios.
Pages: 14
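
The first stage trains sequence models on pose-coordinate data. Below is a minimal sketch, in Keras, of an LSTM action classifier of the kind the abstract describes; the clip length, keypoint dimensionality, class count, and all hyperparameters are illustrative assumptions, not values taken from the paper.

from tensorflow.keras import layers, models

T, D, NUM_ACTIONS = 30, 50, 10  # assumed: frames per clip, coordinates per frame, action classes

model = models.Sequential([
    layers.Input(shape=(T, D)),                       # one pose-keypoint vector per frame
    layers.LSTM(128),                                 # temporal modeling over the clip
    layers.Dense(64, activation="relu"),
    layers.Dense(NUM_ACTIONS, activation="softmax"),  # per-action probabilities
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

The CNN + LSTM variant mentioned in the abstract would replace the raw coordinate input with learned per-frame features ahead of the LSTM.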
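
Facial expression identification relies on the open-source DeepFace toolkit, whose emotion analyzer can be invoked as below; the image path is a placeholder, and the list-of-dicts return shape applies to recent DeepFace releases (older versions return a single dict).

from deepface import DeepFace

# Score one frame for emotion; each detected face yields a dict with
# per-emotion scores and the dominant label.
results = DeepFace.analyze(img_path="frame.jpg", actions=["emotion"])
for face in results:
    print(face["dominant_emotion"], face["emotion"])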
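
The second stage fuses the two modalities and feeds the result to a conventional classifier. Here is a minimal sketch using scikit-learn's kNN, assuming the fused feature vector concatenates action-class probabilities with DeepFace's seven emotion scores; the data is random placeholder input and k = 5 is an illustrative choice, neither taken from the paper.

import numpy as np
from sklearn.neighbors import KNeighborsClassifier

NUM_ACTIONS, NUM_EMOTIONS = 10, 7  # assumed class counts (7 = DeepFace emotion categories)

rng = np.random.default_rng(0)
X = rng.random((200, NUM_ACTIONS + NUM_EMOTIONS))  # placeholder fused feature vectors
y = rng.integers(0, 2, 200)                        # 1 = positive, 0 = negative (placeholder labels)

clf = KNeighborsClassifier(n_neighbors=5)
clf.fit(X, y)
print(clf.predict(X[:3]))                          # predicted valence for sample inputs

Swapping in SVC or DecisionTreeClassifier from scikit-learn would reproduce the paper's other two fusion options.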