Toward Efficient and Robust Metrics for RANSAC Hypotheses and 3D Rigid Registration

Cited by: 29
Authors
Yang, Jiaqi [1 ]
Huang, Zhiqiang [2 ]
Quan, Siwen [3 ]
Zhang, Qian [4 ]
Zhang, Yanning [1 ]
Cao, Zhiguo [5 ]
Affiliations
[1] Northwestern Polytech Univ, Sch Comp Sci, Natl Engn Lab Integrated AeroSp Ground Ocean Big, Xian 710129, Peoples R China
[2] Northwestern Polytech Univ, Sch Software, Xian 710129, Peoples R China
[3] Changan Univ, Sch Elect & Control Engn, Xian 710064, Peoples R China
[4] Hubei Univ, Sch Resources & Environm, Wuhan 430062, Peoples R China
[5] Huazhong Univ Sci & Technol, Key Lab Sci & Technol Multispectral Informat Proc, Sch Artificial Intelligence & Automat, Wuhan 430074, Peoples R China
Funding
National Natural Science Foundation of China; China Postdoctoral Science Foundation
Keywords
Three-dimensional displays; Measurement; Robustness; Feature extraction; Pose estimation; Pipelines; Standards; 3D point cloud; 3D rigid registration; pose estimation; hypothesis evaluation; OBJECT RECOGNITION; POINT CLOUDS; UNIQUE SIGNATURES; URBAN SCENES; SURFACE; HISTOGRAMS; ALGORITHM; FEATURES; REPRESENTATION; CONSENSUS;
DOI
10.1109/TCSVT.2021.3062811
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronic Technology, Communication Technology]
Discipline Codes
0808; 0809
Abstract
This paper focuses on developing efficient and robust evaluation metrics for RANSAC hypotheses to achieve accurate 3D rigid registration. Estimating the six-degree-of-freedom (6-DoF) pose from feature correspondences remains a popular approach to 3D rigid registration, and random sample consensus (RANSAC) is a well-known solution to this problem. However, existing metrics for RANSAC hypotheses are either time-consuming or sensitive to common nuisances, parameter variations, and different application scenarios, degrading both overall registration accuracy and speed. We alleviate this problem by first analyzing the contributions of inliers and outliers and then proposing several efficient and robust hypothesis-evaluation metrics with different design motivations. Comparative experiments on four standard datasets covering different nuisances and application scenarios verify that the proposed metrics significantly improve registration performance and are more robust than several state-of-the-art competitors, making them well suited to practical applications. This work also draws an interesting conclusion, namely that not all inliers are equal while all outliers should be equal, which may shed new light on this research problem.
Pages: 893-906
Page count: 14
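
The abstract's closing observation suggests a hypothesis score in which inliers are weighted by residual quality while every outlier contributes identically. The sketch below is a minimal Python illustration of that idea, not the paper's actual metric; the function name score_hypothesis, the linear inlier weighting, and the threshold tau are assumptions made for the example.

```python
import numpy as np

def score_hypothesis(src, dst, R, t, tau=0.05):
    """Score one RANSAC hypothesis (R, t) on correspondences src[i] -> dst[i].

    Illustrative metric only (not the paper's): inliers are weighted by how
    small their residual is ("not all inliers are equal"), and every outlier
    contributes exactly zero ("all outliers should be equal").
    """
    # Residual of each correspondence under the hypothesized rigid transform.
    residuals = np.linalg.norm(src @ R.T + t - dst, axis=1)
    inlier = residuals < tau  # hard inlier/outlier split at threshold tau
    # Linear weighting: residual 0 scores 1, residual tau scores 0.
    return float(np.sum(1.0 - residuals[inlier] / tau))
```

Compared with the classical inlier count (inlier.sum()), such a saturated score prefers hypotheses whose inliers are aligned tightly over ones whose inliers barely clear the threshold, while keeping the per-hypothesis cost linear in the number of correspondences, which matters because RANSAC evaluates the metric once per sampled hypothesis.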