An Efficient Approach Using Knowledge Distillation Methods to Stabilize Performance in a Lightweight Top-Down Posture Estimation Network

Cited by: 2
Authors
Park, Changhyun [1 ]
Lee, Hean Sung [1 ]
Kim, Woo Jin [1 ]
Bae, Han Byeol [2 ]
Lee, Jaeho [1 ]
Lee, Sangyoun [1 ]
Affiliations
[1] Yonsei Univ, Dept Elect & Elect Engn, Seoul 03722, South Korea
[2] Kwangju Womens Univ, Dept Artificial Intelligence Convergence, Gwangju 62396, South Korea
Keywords
pose estimation; convolutional neural network; lightweight; knowledge distillation; POSE ESTIMATION; SYSTEM;
DOI
10.3390/s21227640
CLC Number
O65 [Analytical Chemistry];
Subject Classification Codes
070302; 081704;
Abstract
Multi-person pose estimation has attracted considerable interest owing to its use in several real-world applications, such as activity recognition, motion capture, and augmented reality. Although improving the accuracy and speed of multi-person pose estimation has been studied extensively, balancing these two aspects remains difficult. In this paper, a novel knowledge-distilled lightweight top-down pose network (KDLPN) is proposed that balances computational complexity and accuracy. For the first time in multi-person pose estimation, we present a network that reduces computational complexity by adopting a "Pelee" backbone structure and by shuffling pixels in the dense upsampling convolution layer to reduce the number of channels. Furthermore, to prevent the performance degradation caused by the reduced computational complexity, knowledge distillation is applied, with a high-capacity pose estimation network serving as the teacher. The performance of the method is evaluated on the MSCOCO dataset. Experimental results demonstrate that KDLPN reduces the number of parameters by 95% compared with state-of-the-art methods, with minimal performance degradation. Moreover, our method is compared with other pose estimation methods to substantiate both the importance of reducing computational complexity and the effectiveness of our approach.
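To make the two ideas summarized in the abstract concrete, the sketch below (PyTorch; the module names, channel sizes, and the loss weighting `alpha` are illustrative assumptions, not taken from the paper) shows how pixel shuffling can trade channels for spatial resolution in an upsampling head, and how a distillation loss can blend supervision from ground-truth heatmaps with a teacher network's soft heatmaps.

```python
# Illustrative sketch only: module names, channel sizes, and `alpha`
# are assumptions, not the paper's exact architecture or loss.
import torch
import torch.nn as nn
import torch.nn.functional as F

class PixelShuffleHead(nn.Module):
    """Upsampling head: a 1x1 conv emits r*r times the target channels,
    then PixelShuffle rearranges those channels into an r-times larger
    feature map, avoiding wide deconvolution layers."""
    def __init__(self, in_ch: int, num_joints: int = 17, r: int = 2):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, num_joints * r * r, kernel_size=1)
        self.shuffle = nn.PixelShuffle(r)  # (N, C*r^2, H, W) -> (N, C, H*r, W*r)

    def forward(self, x):
        return self.shuffle(self.conv(x))

def distillation_loss(student_hm, teacher_hm, gt_hm, alpha=0.5):
    """Blend the hard loss against ground-truth heatmaps with a soft loss
    against the teacher's heatmaps; `alpha` is an assumed balancing weight."""
    hard = F.mse_loss(student_hm, gt_hm)
    soft = F.mse_loss(student_hm, teacher_hm.detach())  # no gradient to teacher
    return alpha * hard + (1.0 - alpha) * soft
```

In such a setup the student would be trained with `distillation_loss(student(x), teacher(x), gt)` while the teacher stays frozen, so the lightweight network inherits the teacher's heatmap structure without its parameter count.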
Pages: 18