FacialNet: facial emotion recognition for mental health analysis using UNet segmentation with transfer learning model

Cited by: 0
Authors
Na, In-Seop [1 ]
Aldrees, Asma [2 ]
Hakeem, Abeer [3 ]
Mohaisen, Linda [3 ]
Umer, Muhammad [4 ]
AlHammadi, Dina Abdulaziz [5 ]
Alsubai, Shtwai [6 ]
Innab, Nisreen [7 ]
Ashraf, Imran [8 ]
Affiliations
[1] Division of Culture Contents, Chonnam National University, Yeosu
[2] Department of Informatics and Computer Systems, College of Computer Science, King Khalid University, Abha
[3] Department of Information Technology, Faculty of Computing and Information Technology, King Abdulaziz University, Jeddah
[4] Department of Computer Science & Information Technology, The Islamia University of Bahawalpur, Bahawalpur
[5] Department of Information Systems, College of Computer and Information Sciences, Princess Nourah bint Abdulrahman University, Riyadh
[6] Department of Computer Science, College of Computer Engineering and Sciences, Prince Sattam Bin Abdulaziz University, Al-Kharj
[7] Department of Computer Science and Information Systems, College of Applied Sciences, AlMaarefa University, Diriyah
[8] Department of Information and Communication Engineering, Yeungnam University, Gyeongsan
Source
Frontiers in Computational Neuroscience | 2024 / Vol. 18
Keywords
EfficientNet; facial emotion recognition; image processing; transfer learning; UNET;
DOI
10.3389/fncom.2024.1485121
Abstract
Facial emotion recognition (FER) can serve as a valuable tool for assessing emotional states, which are often linked to mental health. However, mental health encompasses a broad range of factors that go beyond facial expressions. While FER provides insight into certain aspects of emotional well-being, it can be combined with other assessments to form a more comprehensive picture of an individual's mental health. This work proposes a framework for human FER, called FacialNet, that couples UNet image segmentation with transfer learning based on the EfficientNetB4 model. The proposed model demonstrates promising results, achieving an accuracy of 90% for six emotion classes (happy, sad, fear, pain, anger, and disgust) and 96.39% for binary classification (happy and sad). The significance of FacialNet is demonstrated through extensive experiments against various machine learning and deep learning models, as well as state-of-the-art prior work in FER, and is further validated using cross-validation to ensure reliable performance across different data splits. The findings highlight the effectiveness of leveraging UNet image segmentation and EfficientNetB4 transfer learning for accurate and efficient human facial emotion recognition, offering promising avenues for real-world applications in emotion-aware systems and affective computing platforms. Experimental results show that the proposed approach substantially outperforms existing works, improving accuracy to 96.39% from the previous best of 94.26%. Copyright © 2024 Na, Aldrees, Hakeem, Mohaisen, Umer, AlHammadi, Alsubai, Innab and Ashraf.
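The two-stage pipeline described in the abstract — UNet segmentation of the face region followed by EfficientNetB4-based classification — can be sketched as below. This is a minimal illustrative sketch only: the function names, the crude thresholding stand-in for the UNet, and the random-logit stand-in for the EfficientNetB4 head are assumptions for demonstration, not the authors' implementation.

```python
import numpy as np

# Six emotion classes reported in the abstract.
EMOTIONS = ["happy", "sad", "fear", "pain", "anger", "disgust"]

def unet_segment(image: np.ndarray) -> np.ndarray:
    """Stand-in for the UNet: return a binary face mask of shape (H, W).

    A real UNet predicts a per-pixel face probability; here we simply
    keep pixels brighter than the image mean as a crude placeholder.
    """
    gray = image.mean(axis=-1)
    return (gray > gray.mean()).astype(np.float32)

def efficientnet_classify(masked: np.ndarray) -> np.ndarray:
    """Stand-in for the EfficientNetB4 head: return class probabilities."""
    rng = np.random.default_rng(0)           # deterministic placeholder logits
    logits = rng.normal(size=len(EMOTIONS))
    exp = np.exp(logits - logits.max())      # softmax over the 6 emotions
    return exp / exp.sum()

def facialnet_predict(image: np.ndarray) -> str:
    mask = unet_segment(image)               # stage 1: segment the face
    masked = image * mask[..., None]         # keep only the face region
    probs = efficientnet_classify(masked)    # stage 2: classify the emotion
    return EMOTIONS[int(probs.argmax())]

# Example call on a dummy 224x224 RGB image.
image = np.random.default_rng(1).random((224, 224, 3)).astype(np.float32)
print(facialnet_predict(image))
```

The design point being illustrated is the decoupling: segmentation removes background pixels before classification, so the transfer-learning backbone only sees the face region.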