A study on computer vision for facial emotion recognition

Cited by: 0
Authors
Zi-Yu Huang
Chia-Chin Chiang
Jian-Hao Chen
Yi-Chian Chen
Hsin-Lung Chung
Yu-Ping Cai
Hsiu-Chuan Hsu
Affiliations
[1] National Kaohsiung University of Science and Technology, Department of Mechanical Engineering
[2] National Chengchi University, Graduate Institute of Applied Physics
[3] Fooyin University, Department of Occupational Safety and Hygiene
[4] Hsin Sheng Junior College of Medical Care and Management, Department of Nursing
[5] National Chengchi University, Department of Computer Science
Source
Scientific Reports, Volume 13
DOI: not available
Abstract
Artificial intelligence has been successfully applied in various fields, one of which is computer vision. In this study, a deep neural network (DNN) was adopted for facial emotion recognition (FER). One objective of this study is to identify the critical facial features on which the DNN model focuses for FER. In particular, we utilized a convolutional neural network (CNN), a combination of a squeeze-and-excitation network and a residual neural network, for the FER task. We used AffectNet and the Real-World Affective Faces Database (RAF-DB) as the facial expression databases providing learning samples for the CNN. Feature maps were extracted from the residual blocks for further analysis. Our analysis shows that the features around the nose and mouth are critical facial landmarks for the neural networks. Cross-database validations were conducted between the databases. The network model trained on AffectNet achieved 77.37% accuracy when validated on the RAF-DB, while the network model pretrained on AffectNet and then fine-tuned via transfer learning on the RAF-DB achieved a validation accuracy of 83.37%. The outcomes of this study improve the understanding of neural networks and can assist in improving computer vision accuracy.
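The abstract describes combining a squeeze-and-excitation (SE) network with a residual network. The core SE operation recalibrates channel responses: globally average-pool each channel ("squeeze"), pass the result through a small bottleneck MLP with a sigmoid gate ("excitation"), and rescale the channels. The sketch below illustrates that recalibration in plain NumPy with random weights and an assumed reduction ratio r = 4; it is a minimal illustration of the mechanism, not the authors' implementation.

```python
import numpy as np

def se_block(x, w1, w2):
    """Squeeze-and-excitation recalibration of a single feature map.

    x  : feature map, shape (C, H, W)
    w1 : reduction FC weights, shape (C // r, C)
    w2 : expansion FC weights, shape (C, C // r)
    """
    # Squeeze: global average pooling over spatial dims -> one scalar per channel
    z = x.mean(axis=(1, 2))                      # shape (C,)
    # Excitation: bottleneck MLP, ReLU then sigmoid gating
    s = np.maximum(w1 @ z, 0.0)                  # shape (C // r,)
    s = 1.0 / (1.0 + np.exp(-(w2 @ s)))          # shape (C,), values in (0, 1)
    # Scale: reweight each channel of the original map by its gate
    return x * s[:, None, None]

# Illustrative sizes and random weights (hypothetical, not from the paper)
rng = np.random.default_rng(0)
C, H, W, r = 64, 7, 7, 4
x = rng.standard_normal((C, H, W))
w1 = rng.standard_normal((C // r, C)) * 0.1
w2 = rng.standard_normal((C, C // r)) * 0.1
y = se_block(x, w1, w2)
print(y.shape)  # (64, 7, 7)
```

In the SE-ResNet used here, this gating is applied to the output of each residual block before the skip connection is added, which is why the channel-wise gates can be inspected to see which facial regions the network emphasizes.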