Facial landmark points detection using knowledge distillation-based neural networks

Cited by: 14
Authors
Fard, Ali Pourramezan [1]
Mahoor, Mohammad H. [1]
Affiliation
[1] Univ Denver, Dept Elect & Comp Engn, 2155 E Wesley Ave, Denver, CO 80208 USA
Keywords
Deep learning; Face alignment; Facial landmark points detection; Knowledge distillation
DOI
10.1016/j.cviu.2021.103316
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Facial landmark detection is a vital step for numerous facial image analysis applications. Although some deep learning-based methods have achieved good performance on this task, they are often unsuitable for running on mobile devices: such methods rely on networks with many parameters, which makes training and inference time-consuming. Training lightweight neural networks such as MobileNets is often challenging, and the resulting models may have low accuracy. Inspired by knowledge distillation (KD), this paper presents a novel loss function to train a lightweight Student network (e.g., MobileNetV2) for facial landmark detection. We use two Teacher networks, a Tolerant-Teacher and a Tough-Teacher, in conjunction with the Student network. The Tolerant-Teacher is trained using Soft-landmarks created by active shape models, while the Tough-Teacher is trained using the ground-truth landmark points (aka Hard-landmarks). To utilize the facial landmark points predicted by the Teacher networks, we define an Assistive Loss (ALoss) for each Teacher network. Moreover, we define a loss function called KD-Loss that utilizes the facial landmark points predicted by the two pre-trained Teacher networks (EfficientNet-b3) to guide the lightweight Student network towards predicting the Hard-landmarks. Our experimental results on three challenging facial datasets show that the proposed architecture results in a better-trained Student network that can extract facial landmark points with high accuracy.
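The abstract describes a two-Teacher distillation setup: the Student is supervised by the ground-truth Hard-landmarks and, through an Assistive Loss per Teacher, by the predictions of the frozen Tough- and Tolerant-Teachers. Below is a minimal PyTorch sketch of that combination; the plain L2 terms and the scalar weights w_hard, w_tough, and w_tolerant are illustrative assumptions, not the paper's exact ALoss or KD-Loss formulation.

```python
import torch
import torch.nn as nn

class TwoTeacherKDLoss(nn.Module):
    """Illustrative two-Teacher distillation loss for landmark regression.
    The per-term weights and plain L2 shapes are assumptions for this
    sketch; the paper defines its own ALoss and KD-Loss terms."""

    def __init__(self, w_hard=1.0, w_tough=0.5, w_tolerant=0.5):
        super().__init__()
        self.w_hard = w_hard          # weight on ground-truth (Hard-landmark) term
        self.w_tough = w_tough        # weight on Tough-Teacher assistive term
        self.w_tolerant = w_tolerant  # weight on Tolerant-Teacher assistive term
        self.mse = nn.MSELoss()

    def forward(self, student_pts, hard_pts, tough_pts, tolerant_pts):
        # Main regression term: Student vs. ground-truth Hard-landmarks.
        loss = self.w_hard * self.mse(student_pts, hard_pts)
        # Assistive terms: pull the Student toward each Teacher's prediction.
        loss = loss + self.w_tough * self.mse(student_pts, tough_pts)
        loss = loss + self.w_tolerant * self.mse(student_pts, tolerant_pts)
        return loss
```

In a training loop, the Teacher predictions would be computed under torch.no_grad(), since both Teachers are pre-trained and frozen; only the Student receives gradients.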
Pages: 12
Related Papers
50 records in total
  • [31] Homogeneous teacher based buffer knowledge distillation for tiny neural networks
    Dai, Xinru
    Lu, Gang
    Shen, Jianhua
    Huang, Shuo
    Wei, Tongquan
    JOURNAL OF SYSTEMS ARCHITECTURE, 2024, 148
  • [32] Feature Distribution-based Knowledge Distillation for Deep Neural Networks
    Hong, Hyeonseok
    Kim, Hyun
    2022 19TH INTERNATIONAL SOC DESIGN CONFERENCE (ISOCC), 2022, : 75 - 76
  • [33] Feature knowledge distillation-based model lightweight for prohibited item detection in X-ray security inspection images
    Ren, Yu
    Zhao, Lun
    Zhang, Yongtao
    Liu, Yiyao
    Yang, Jinfeng
    Zhang, Haigang
    Lei, Baiying
    ADVANCED ENGINEERING INFORMATICS, 2025, 65
  • [34] Knowledge distillation on neural networks for evolving graphs
    Antaris, Stefanos
    Rafailidis, Dimitrios
    Girdzijauskas, Sarunas
    SOCIAL NETWORK ANALYSIS AND MINING, 2021, 11 (01)
  • [36] Lightweight Alpha Matting Network Using Distillation-Based Channel Pruning
    Yoon, Donggeun
    Park, Jinsun
    Cho, Donghyeon
    COMPUTER VISION - ACCV 2022, PT III, 2023, 13843 : 103 - 119
  • [37] Robust Facial Landmark Detection via Occlusion-adaptive Deep Networks
    Zhu, Meilu
    Shi, Daming
    Zheng, Mingjie
    Sadiq, Muhammad
    2019 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2019), 2019, : 3481 - 3491
  • [38] Robust Facial Landmark Detection via Recurrent Attentive-Refinement Networks
    Xiao, Shengtao
    Feng, Jiashi
    Xing, Junliang
    Lai, Hanjiang
    Yan, Shuicheng
    Kassim, Ashraf
    COMPUTER VISION - ECCV 2016, PT I, 2016, 9905 : 57 - 72
  • [39] Rectified Wing Loss for Efficient and Robust Facial Landmark Localisation with Convolutional Neural Networks
    Feng, Zhen-Hua
    Kittler, Josef
    Awais, Muhammad
    Wu, Xiao-Jun
    INTERNATIONAL JOURNAL OF COMPUTER VISION, 2020, 128 (8-9) : 2126 - 2145
  • [40] KD-INR: Time-Varying Volumetric Data Compression via Knowledge Distillation-Based Implicit Neural Representation
    Han, Jun
    Zheng, Hao
    Bi, Chongke
    IEEE TRANSACTIONS ON VISUALIZATION AND COMPUTER GRAPHICS, 2024, 30 (10) : 6826 - 6838