Pose Partition Networks for Multi-person Pose Estimation

Cited by: 48
Authors
Nie, Xuecheng [1 ]
Feng, Jiashi [1 ]
Xing, Junliang [2 ]
Yan, Shuicheng [1 ,3 ]
Affiliations
[1] National University of Singapore, ECE Department, Singapore, Singapore
[2] Chinese Academy of Sciences, Institute of Automation, Beijing, China
[3] Qihoo 360 AI Institute, Beijing, China
Source
COMPUTER VISION - ECCV 2018, PT V | 2018 / Volume 11209
Keywords
Multi-person pose estimation; Pose partition; Dense regression
DOI
10.1007/978-3-030-01228-1_42
CLC Classification
TP18 [Theory of Artificial Intelligence]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
This paper proposes a novel Pose Partition Network (PPN) to address the challenging multi-person pose estimation problem. PPN offers both low complexity and high accuracy in joint detection and partition. In particular, PPN performs dense regressions from global joint candidates within an embedding space parameterized by person centroids, which efficiently yields robust person detections and joint partitions. PPN then infers body joint configurations by conducting a local graph partition within each person detection, utilizing reliable global affinity cues. In this way, PPN reduces computational complexity and significantly improves multi-person pose estimation. We implement PPN with the Hourglass architecture as the backbone network to learn the joint detector and dense regressor simultaneously. Extensive experiments on the MPII Human Pose Multi-Person, extended PASCAL-Person-Part, and WAF benchmarks demonstrate the efficiency of PPN and its new state-of-the-art performance.
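The centroid-embedding idea in the abstract can be illustrated with a minimal sketch, not the authors' implementation: each joint candidate regresses an offset pointing to the centroid of the person it belongs to, and joints whose centroid votes fall close together are grouped into one person. The function name, the greedy clustering (standing in for the paper's local graph partition), and the merge_radius hyperparameter are all assumptions for illustration only.

```python
# Hypothetical sketch of PPN-style joint grouping (not the authors' code).
# Assumes a network has already produced, for every joint candidate, its
# image coordinates and a regressed offset toward its person's centroid.
import numpy as np

def group_joints_by_centroid(joint_xy, centroid_offset, merge_radius=32.0):
    """Group joint candidates whose centroid votes land close together.

    joint_xy:        (N, 2) array of joint candidate positions.
    centroid_offset: (N, 2) array of regressed offsets to the person centroid.
    merge_radius:    pixel distance under which two centroid votes are treated
                     as the same person (an assumed hyperparameter).
    Returns a list of index arrays, one per detected person.
    """
    votes = joint_xy + centroid_offset      # each joint votes for a centroid
    persons, centers = [], []               # greedy clustering of the votes
    for i, v in enumerate(votes):
        for p, c in zip(persons, centers):
            if np.linalg.norm(v - c) < merge_radius:
                p.append(i)
                c += (v - c) / len(p)       # running mean of the cluster center
                break
        else:                               # no nearby centroid: new person
            persons.append([i])
            centers.append(v.copy())
    return [np.asarray(p) for p in persons]

# Toy usage: two joints voting for one person, a third for another.
joints = np.array([[10.0, 10.0], [20.0, 12.0], [200.0, 50.0]])
offsets = np.array([[5.0, 0.0], [-5.0, -2.0], [0.0, 10.0]])
print(group_joints_by_centroid(joints, offsets))  # -> [array([0, 1]), array([2])]
```

In the paper itself, grouping is done by a graph partition performed locally within each person detection; the greedy pass above only illustrates how regressing joints into a centroid-parameterized embedding space makes that local grouping cheap.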
Pages: 705-720
Page count: 16