Masked autoencoder of multi-scale convolution strategy combined with knowledge distillation for facial beauty prediction

Cited: 0
Authors
Gan, Junying [1 ]
Xiong, Junling [1 ]
Affiliations
[1] Wuyi Univ, Sch Elect Informat Engn, Jiangmen 529020, Guangdong, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
DOI: 10.1038/s41598-025-86831-0
Chinese Library Classification
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences];
Discipline codes
07; 0710; 09;
Abstract
Facial beauty prediction (FBP) is an active research area in artificial intelligence. FBP databases currently contain a small amount of labeled data and a large amount of unlabeled data, so models trained purely with supervision extract limited features and achieve low prediction accuracy. The masked autoencoder (MAE) is a self-supervised learning method that outperforms supervised methods without relying on large-scale labeled databases, and it can effectively improve a model's feature-extraction ability. A multi-scale convolution strategy expands the receptive field and, combined with the MAE's attention mechanism, captures dependencies between distant pixels and acquires both shallow and deep image features. Knowledge distillation transfers the rich knowledge of a teacher net to a student net, reducing the number of parameters and compressing the model. In this paper, an MAE with a multi-scale convolution strategy is combined with knowledge distillation for FBP. First, the MAE model with the multi-scale convolution strategy is constructed and pretrained as the teacher net. Second, an MAE model is constructed as the student net. Finally, the teacher net performs knowledge distillation, and the student net is optimized with the loss function transmitted from the teacher net. Experimental results show that the proposed method outperforms other methods on the FBP task, improves FBP accuracy, and can be widely applied to tasks such as image classification.
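The record gives no implementation details of the distillation step. As an illustrative sketch only, the teacher-to-student transfer described above is commonly realized as a temperature-scaled KL-divergence loss between teacher and student logits; the function names, temperature value, and logits below are hypothetical, not taken from the paper:

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax over the last axis (numerically stabilized)
    e = np.exp((z - z.max(axis=-1, keepdims=True)) / T)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=4.0):
    # KL(teacher || student) on softened distributions, scaled by T^2
    # (the usual scaling so gradients stay comparable across temperatures)
    p = softmax(teacher_logits, T)  # soft targets from the teacher net
    q = softmax(student_logits, T)  # predictions from the student net
    return float((p * (np.log(p) - np.log(q))).sum(axis=-1).mean() * T * T)

# Illustrative batch: two samples, three classes (made-up logit values)
teacher = np.array([[2.0, 0.5, -1.0], [0.1, 1.5, 0.2]])
student = np.array([[1.5, 0.7, -0.5], [0.0, 1.0, 0.4]])
loss = distillation_loss(student, teacher)
```

In a full training loop this distillation term would typically be combined with the ordinary supervised loss on the labeled portion of the data; the loss is zero only when the student's softened distribution matches the teacher's exactly.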
Pages: 17