Pyramid U-Network for Skeleton Extraction from Shape Points

Cited by: 13
Authors
Atienza, Rowel [1]
Affiliations
[1] Univ Philippines, Elect & Elect Engn Inst, Quezon City, Philippines
Source
2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW 2019) | 2019
Keywords
Recognition
DOI
10.1109/CVPRW.2019.00155
CLC number
TP18 [Artificial intelligence theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Knowledge of the skeleton of a geometric shape has many practical applications, such as shape animation, shape comparison, shape recognition, and estimating structural strength. Skeleton extraction becomes more challenging when the topology is represented in the point cloud domain. In this paper, we present PSPU-SkelNet, the network architecture of TeamPH, which ranked 3rd in the Point SkelNetOn 2019 challenge [2]. PSPU-SkelNet is a pyramid of three U-Nets that predicts the skeleton from a given shape point cloud. It achieves a Chamfer Distance (CD) of 2.9105 on the final test dataset. The code of PSPU-SkelNet is available at https://github.com/roatienza/skelnet.
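The challenge metric mentioned in the abstract, Chamfer Distance, can be illustrated with a small sketch. This uses a common symmetric formulation (sum of average nearest-neighbor distances in both directions); the exact normalization used by the SkelNetOn evaluation is an assumption here, not taken from the record.

```python
import numpy as np

def chamfer_distance(p: np.ndarray, q: np.ndarray) -> float:
    """Symmetric Chamfer Distance between point sets p (N, d) and q (M, d)."""
    # Pairwise Euclidean distances: dist[i, j] = ||p_i - q_j||
    dist = np.linalg.norm(p[:, None, :] - q[None, :, :], axis=-1)
    # Average distance from each point to its nearest neighbor in the other set,
    # accumulated in both directions so the metric is symmetric.
    return dist.min(axis=1).mean() + dist.min(axis=0).mean()

# Toy example: a few shape points versus a one-point "skeleton"
shape = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.0]])
skeleton = np.array([[1.0, 0.5]])
print(chamfer_distance(shape, skeleton))
```

A lower CD means the predicted skeleton points lie closer to the ground-truth skeleton and vice versa; a perfect prediction over identical point sets gives a distance of zero.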
Pages: 1177-1180 (4 pages)
Related Papers
16 items in total
[1] [Anonymous], 2016, Proc. CVPR IEEE, DOI 10.1109/CVPR.2016.265
[2] [Anonymous], 2017, IEEE Proc. Comput. Vis. Pattern Recognit., DOI 10.1109/CVPR.2017.16
[3] Bronstein A.M., Bronstein M.M., Bruckstein A.M., Kimmel R. Analysis of two-dimensional non-rigid shapes. International Journal of Computer Vision, 2008, 78(1):67-88
[4] Demir Ilke, 2019, SKELNETON 2019 DATAS
[5] Durix B., Chambon S., Leonard K., Mari J.-L., Morin G. The Propagated Skeleton: A Robust Detail-Preserving Approach. Discrete Geometry for Computer Imagery (DGCI 2019), 2019, 11414:343-354
[6] Goodfellow I.J., 2014, Adv. Neural Inf. Process. Syst., V27, P2672
[7] Isola P., 2017, Proc. CVPR IEEE, P1125, DOI 10.1109/CVPR.2017.632
[8] Kingma D.P., 2014, arXiv
[9] Leborgne A., 2014, Lect. Notes Comput. Sci., V8887, P293, DOI 10.1007/978-3-319-14249-4_28
[10] Leonard K., 2016, Int. Conf. Pattern Recognit., P3216, DOI 10.1109/ICPR.2016.7900130