[41] Zhang J., BadCleaner: Defending backdoor attacks in federated learning via attention-based multi-teacher distillation, IEEE Trans. Dependable Secur. Comput., 21, pp. 4559-4573, (2024)
[42] Bai L., An information-theoretical framework for cluster ensemble, IEEE Trans. Knowl. Data Eng., 31, pp. 1464-1477, (2018)
[43] Lee K., Fast and accurate facial expression image classification and regression method based on knowledge distillation, Appl. Sci., 13, (2023)
[44] Bai S., An empirical evaluation of generic convolutional and recurrent networks for sequence modeling, arXiv, (2018)
[45] Saputra M.R.U., Distilling knowledge from a deep pose regressor network, Proceedings of the IEEE/CVF International Conference on Computer Vision, pp. 263-272, (2019)
[46] Chen G., Choi W., Yu X., Han T., Chandraker M., Learning efficient object detection models with knowledge distillation, Adv. Neural Inf. Process. Syst., 30, pp. 742-751, (2017)
[47] Takamoto M., An efficient method of training small models for regression problems with knowledge distillation, Proceedings of the 2020 IEEE Conference on Multimedia Information Processing and Retrieval (MIPR), pp. 67-72, (2020)