Automated facial expression recognition using exemplar hybrid deep feature generation technique

Cited by: 11
Authors
Baygin, Mehmet [1]
Tuncer, Ilknur [2]
Dogan, Sengul [3]
Barua, Prabal Datta [4,5]
Tuncer, Turker [3]
Cheong, Kang Hao [6]
Acharya, U. Rajendra [7]
Affiliations
[1] Ardahan Univ, Fac Engn, Dept Comp Engn, Ardahan, Turkiye
[2] Interior Minist, Elazig, Turkiye
[3] Firat Univ, Technol Fac, Dept Digital Forens Engn, Elazig, Turkiye
[4] Univ Southern Queensland, Sch Business Informat Syst, Toowoomba, Qld 4350, Australia
[5] Univ Technol Sydney, Fac Engn & Informat Technol, Sydney, NSW 2007, Australia
[6] Singapore Univ Technol & Design, Sci Math & Technol Cluster, Singapore 487372, Singapore
[7] Univ Southern Queensland, Sch Math Phys & Comp, Springfield, Australia
Keywords
Facial expression recognition; Exemplar deep feature; Neighbor component analysis; Emotion detection; Feature selection
DOI
10.1007/s00500-023-08230-9
CLC classification number
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The perception and recognition of emotional expressions provide essential information about individuals' social behavior, so decoding emotional expressions is very important. Facial expression recognition (FER) is one of the most frequently studied topics in this area. The proposed FER model has four main phases: (i) facial areas are segmented from the face images; (ii) an exemplar deep feature-based model is applied, in which two pretrained deep networks (AlexNet and MobileNetV2) serve as feature generators and are merged into a single feature generation function; (iii) the 1000 most valuable features are selected by neighborhood component analysis (NCA); and (iv) these 1000 features are classified with a support vector machine (SVM). We developed our model using five FER corpora: TFEID, JAFFE, KDEF, CK+, and Oulu-CASIA, on which it yields accuracies of 97.01%, 98.59%, 96.54%, 100%, and 100%, respectively. These results show that the proposed exemplar deep feature extraction approach achieves high success rates for automated FER across various databases.
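The exemplar pipeline summarized in the abstract (divide each face image into patches, extract and merge features from two generators per patch, then keep only the top-ranked features) can be sketched roughly as below. This is a minimal NumPy sketch, not the authors' implementation: the toy statistics stand in for the pretrained AlexNet/MobileNetV2 features, the simple class-separation score stands in for NCA-based feature ranking, and all names and sizes are illustrative.

```python
import numpy as np

def exemplar_features(img, patch, gen_a, gen_b):
    """Split img into non-overlapping patches; extract features from each
    patch with both generators (stand-ins for the two pretrained CNNs)
    and concatenate them, mirroring the merged feature-generation step."""
    h, w = img.shape
    feats = []
    for y in range(0, h - patch + 1, patch):
        for x in range(0, w - patch + 1, patch):
            p = img[y:y + patch, x:x + patch]
            feats.append(np.concatenate([gen_a(p), gen_b(p)]))
    return np.concatenate(feats)

# Toy stand-in generators (the paper uses pretrained deep network layers).
gen_a = lambda p: np.array([p.mean(), p.std()])
gen_b = lambda p: np.array([p.max(), p.min()])

def select_top_k(X, y, k):
    """Stand-in for NCA-based selection: rank features by how far each
    class mean deviates from the global mean, then keep the top k."""
    scores = np.zeros(X.shape[1])
    for c in np.unique(y):
        scores += np.abs(X[y == c].mean(axis=0) - X.mean(axis=0))
    idx = np.argsort(scores)[::-1][:k]
    return X[:, idx], idx

# Example: 8 tiny 16x16 "face" images in 2 classes.
rng = np.random.default_rng(0)
imgs = rng.random((8, 16, 16))
y = np.array([0, 0, 0, 0, 1, 1, 1, 1])
X = np.stack([exemplar_features(im, 8, gen_a, gen_b) for im in imgs])
X_sel, idx = select_top_k(X, y, k=4)
print(X.shape, X_sel.shape)  # feature matrix before and after selection
```

In the paper the selected features (1000 of them) would then be fed to an SVM classifier; here the patch size, generators, and k are kept tiny so the shapes are easy to follow.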
Pages: 8721-8737 (17 pages)