Lightweight Human Pose Estimation Based on Densely Guided Self-Knowledge Distillation

Cited by: 2
Authors
Wu, Mingyue [1 ,2 ]
Zhao, Zhong-Qiu [1 ,2 ,3 ]
Li, Jiajun [1 ,2 ]
Tian, Weidong [1 ,2 ,3 ]
Affiliations
[1] Hefei Univ Technol, Sch Comp Sci & Informat Engn, Hefei 230009, Peoples R China
[2] Intelligent Mfg Inst HFUT, Hefei, Peoples R China
[3] Guangxi Acad Sci, Nanning, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Pose estimation; Knowledge distillation; Binarization operation
DOI
10.1007/978-3-031-44210-0_34
CLC number
TP18 [Artificial intelligence theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Current human pose estimation networks are difficult to deploy on lightweight devices because of their large number of parameters. Knowledge distillation is an effective remedy, but the student network's learning ability remains insufficient: (1) multi-teacher distillation suffers from an error-avalanche problem; (2) the heatmaps generated by teachers contain noise, which degrades the model; (3) the effect of self-knowledge distillation is ignored; and (4) pose estimation is treated purely as a regression problem, overlooking that it is also a classification problem. To address these problems, we propose a densely guided self-knowledge distillation framework named DSKD to solve the error-avalanche problem, introduce a binarization operation to reduce the noise in the teacher's heatmaps, and add a classification loss to the total loss to guide the student's learning. Experimental results show that our method effectively improves the performance of different lightweight models.
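To make the loss composition described in the abstract concrete, the sketch below shows one plausible way to combine heatmap regression, distillation against binarized teacher heatmaps, and an auxiliary classification term. This is a minimal sketch, not the authors' implementation: the threshold, the loss weights alpha and beta, the function names, and the per-joint presence classification head are all assumptions introduced for illustration; PyTorch is assumed.

    # Minimal, hypothetical sketch (not the authors' released code), assuming PyTorch.
    import torch
    import torch.nn.functional as F

    def binarize_heatmaps(teacher_heatmaps, threshold=0.5):
        # Suppress low-confidence noise: scale each joint heatmap by its peak value,
        # then keep only the region above the (assumed) threshold.
        peak = teacher_heatmaps.amax(dim=(2, 3), keepdim=True).clamp(min=1e-6)
        return (teacher_heatmaps / peak >= threshold).float()

    def total_loss(student_hm, teacher_hm, gt_hm, joint_logits, joint_labels,
                   alpha=0.5, beta=0.1):
        # Regression to ground-truth heatmaps (the usual pose-estimation objective).
        loss_pose = F.mse_loss(student_hm, gt_hm)
        # Distillation against the binarized (denoised) teacher heatmaps.
        loss_kd = F.mse_loss(student_hm, binarize_heatmaps(teacher_hm))
        # Auxiliary classification term; modeled here as per-joint presence
        # classification, which is an assumption for illustration only.
        loss_cls = F.cross_entropy(joint_logits, joint_labels)
        return loss_pose + alpha * loss_kd + beta * loss_cls

    if __name__ == "__main__":
        B, K, H, W = 2, 17, 64, 48                   # batch, joints, heatmap size
        student = torch.rand(B, K, H, W, requires_grad=True)
        teacher = torch.rand(B, K, H, W)
        target = torch.rand(B, K, H, W)
        logits = torch.randn(B * K, 2, requires_grad=True)
        labels = torch.randint(0, 2, (B * K,))
        total_loss(student, teacher, target, logits, labels).backward()

Thresholding each peak-normalized joint map keeps only the teacher's confident response region, which is one way to realize the noise-reduction role the abstract attributes to the binarization operation.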
Pages: 421 - 433
Number of pages: 13
Related papers
50 records in total
  • [1] A Lightweight Approach for Network Intrusion Detection based on Self-Knowledge Distillation
    Yang, Shuo
    Zheng, Xinran
    Xu, Zhengzhuo
    Wang, Xingjun
ICC 2023 - IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS, 2023: 3000 - 3005
  • [2] Adaptive lightweight network construction method for Self-Knowledge Distillation
    Lu, Siyuan
    Zeng, Weiliang
    Li, Xueshi
    Ou, Jiajun
    NEUROCOMPUTING, 2025, 624
  • [3] Lightweight densely connected residual network for human pose estimation
    Yang, Lianping
    Qin, Yu
    Zhang, Xiangde
JOURNAL OF REAL-TIME IMAGE PROCESSING, 2021, 18 (03): 825 - 837
  • [4] EFFECTIVE KNOWLEDGE DISTILLATION FOR HUMAN POSE ESTIMATION
    Zhou, Yang
    Gu, Xiaofeng
    Fu, Hong
    Li, Na
    Du, Xuemei
    Kuang, Ping
2019 16TH INTERNATIONAL COMPUTER CONFERENCE ON WAVELET ACTIVE MEDIA TECHNOLOGY AND INFORMATION PROCESSING (ICWAMTIP), 2019: 170 - 173
  • [5] A Lightweight Convolution Network with Self-Knowledge Distillation for Hyperspectral Image Classification
    Xu, Hao
    Cao, Guo
    Deng, Lindiao
    Ding, Lanwei
    Xu, Ling
    Pan, Qikun
    Shang, Yanfeng
    FOURTEENTH INTERNATIONAL CONFERENCE ON GRAPHICS AND IMAGE PROCESSING, ICGIP 2022, 2022, 12705
  • [6] Human Pose Estimation via an Ultra-Lightweight Pose Distillation Network
    Zhang, Shihao
    Qiang, Baohua
    Yang, Xianyi
    Wei, Xuekai
    Chen, Ruidong
    Chen, Lirui
    ELECTRONICS, 2023, 12 (12)
  • [7] Neighbor self-knowledge distillation
    Liang, Peng
    Zhang, Weiwei
    Wang, Junhuang
    Guo, Yufeng
    INFORMATION SCIENCES, 2024, 654
  • [8] Lightweight Multiperson Pose Estimation With Staggered Alignment Self-Distillation
    Fan, Zhenkun
    Huang, Zhuoxu
    Chen, Zhixiang
    Xu, Tao
    Han, Jungong
    Kittler, Josef
IEEE TRANSACTIONS ON MULTIMEDIA, 2024, 26: 9228 - 9240
  • [9] Self-knowledge distillation based on dynamic mixed attention
    Tang, Yuan
    Chen, Ying
Kongzhi yu Juece/Control and Decision, 2024, 39 (12): 4099 - 4108