Lightweight Human Pose Estimation Based on Densely Guided Self-Knowledge Distillation

Cited: 2
|
Authors
Wu, Mingyue [1 ,2 ]
Zhao, Zhong-Qiu [1 ,2 ,3 ]
Li, Jiajun [1 ,2 ]
Tian, Weidong [1 ,2 ,3 ]
Affiliations
[1] Hefei Univ Technol, Sch Comp Sci & Informat Engn, Hefei 230009, Peoples R China
[2] Intelligent Mfg Inst HFUT, Hefei, Peoples R China
[3] Guangxi Acad Sci, Nanning, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Pose estimation; Knowledge distillation; Binarization operation;
DOI
10.1007/978-3-031-44210-0_34
CLC number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Current human pose estimation networks are difficult to deploy on lightweight devices because of their large number of parameters. Knowledge distillation is an effective solution, but the student network's learning ability remains insufficient for several reasons: (1) multi-teacher distillation suffers from an error-avalanche problem; (2) the heatmaps generated by teachers contain noise, which degrades the model; (3) the effect of self-knowledge distillation is ignored; and (4) pose estimation is treated as a regression problem, while its classification aspect is usually overlooked. To address these problems, we propose a densely guided self-knowledge distillation framework, named DSKD, to solve the error-avalanche problem; introduce a binarization operation to reduce the noise in the teacher's heatmaps; and add a classification loss to the total loss to guide the student's learning. Experimental results show that our method effectively improves the performance of different lightweight models.
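The two loss-level ideas in the abstract (binarizing noisy teacher heatmaps, and adding a classification term alongside the usual heatmap regression) can be illustrated with a minimal NumPy sketch. All function names, the threshold value, and the weighting scheme below are illustrative assumptions for exposition; they are not the paper's actual DSKD implementation.

```python
import numpy as np

def binarize_heatmap(heatmap, threshold=0.5):
    """Suppress low-confidence teacher activations (treated as noise),
    keeping only the peak responses around the true keypoint."""
    return np.where(heatmap >= threshold, heatmap, 0.0)

def keypoint_class_loss(student_hm, teacher_hm, eps=1e-12):
    """Classification view of pose estimation: flatten each K x H x W
    heatmap and treat the teacher's argmax pixel as a pseudo class label,
    then apply cross-entropy over the student's flattened heatmap."""
    logits = student_hm.reshape(student_hm.shape[0], -1)
    logits = logits - logits.max(axis=1, keepdims=True)  # numerical stability
    probs = np.exp(logits)
    probs /= probs.sum(axis=1, keepdims=True)
    labels = teacher_hm.reshape(teacher_hm.shape[0], -1).argmax(axis=1)
    return -np.mean(np.log(probs[np.arange(len(labels)), labels] + eps))

def distillation_loss(student_hm, teacher_hm, alpha=0.5, threshold=0.5):
    """Weighted sum of a regression term (MSE against the binarized,
    i.e. denoised, teacher heatmaps) and the classification term."""
    clean_teacher = binarize_heatmap(teacher_hm, threshold)
    mse = np.mean((student_hm - clean_teacher) ** 2)
    return alpha * mse + (1.0 - alpha) * keypoint_class_loss(student_hm, teacher_hm)
```

In an actual training loop the heatmaps would be network outputs (e.g. torch tensors) and the regression/classification weighting would be tuned; the sketch only shows how denoising the teacher signal and adding a classification objective fit together in one total loss.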
Pages: 421 - 433
Page count: 13
Related papers
50 records
  • [41] Lightweight human pose estimation based on adaptive feature sensing
    Wu Ning
    Wang Peng
    Li Xiao-yan
    Lu Zhi-gang
    Sun Meng-yu
    CHINESE JOURNAL OF LIQUID CRYSTALS AND DISPLAYS, 2023, 38 (08) : 1107 - 1117
  • [42] Efficient Pose Estimation via a Lightweight Single-Branch Pose Distillation Network
    Zhang, Shihao
    Qiang, Baohua
    Yang, Xianyi
    Zhou, Mingliang
    Chen, Ruidong
    Chen, Lirui
    IEEE SENSORS JOURNAL, 2023, 23 (22) : 27709 - 27719
  • [43] ROBUST AND ACCURATE OBJECT DETECTION VIA SELF-KNOWLEDGE DISTILLATION
    Xu, Weipeng
    Chu, Pengzhi
    Xie, Renhao
    Xiao, Xiongziyan
    Huang, Hongcheng
    2022 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING, ICIP, 2022, : 91 - 95
  • [44] SELF-KNOWLEDGE DISTILLATION VIA FEATURE ENHANCEMENT FOR SPEAKER VERIFICATION
    Liu, Bei
    Wang, Haoyu
    Chen, Zhengyang
    Wang, Shuai
    Qian, Yanmin
    2022 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2022, : 7542 - 7546
  • [45] MixSKD: Self-Knowledge Distillation from Mixup for Image Recognition
    Yang, Chuanguang
    An, Zhulin
    Zhou, Helong
    Cai, Linhang
    Zhi, Xiang
    Wu, Jiwen
    Xu, Yongjun
    Zhang, Qian
    COMPUTER VISION, ECCV 2022, PT XXIV, 2022, 13684 : 534 - 551
  • [46] Personalized Edge Intelligence via Federated Self-Knowledge Distillation
    Jin, Hai
    Bai, Dongshan
    Yao, Dezhong
    Dai, Yutong
    Gu, Lin
    Yu, Chen
    Sun, Lichao
    IEEE TRANSACTIONS ON PARALLEL AND DISTRIBUTED SYSTEMS, 2023, 34 (02) : 567 - 580
  • [47] Self-Knowledge Distillation for First Trimester Ultrasound Saliency Prediction
    Gridach, Mourad
    Savochkina, Elizaveta
    Drukker, Lior
    Papageorghiou, Aris T.
    Noble, J. Alison
    SIMPLIFYING MEDICAL ULTRASOUND, ASMUS 2022, 2022, 13565 : 117 - 127
  • [48] Research on Lightweight High-resolution Network Human Pose Estimation Based on Self-attention
    Liu, Guangyu
    Zhong, Xiaoling
    Ma, Lizhi
    2023 IEEE 8TH INTERNATIONAL CONFERENCE ON BIG DATA ANALYTICS, ICBDA, 2023, : 142 - 146
  • [49] Automatic Diabetic Retinopathy Grading via Self-Knowledge Distillation
    Luo, Ling
    Xue, Dingyu
    Feng, Xinglong
    ELECTRONICS, 2020, 9 (09) : 1 - 13
  • [50] Decoupled Feature and Self-Knowledge Distillation for Speech Emotion Recognition
    Yu, Haixiang
    Ning, Yuan
    IEEE ACCESS, 2025, 13 : 33275 - 33285