Robust Point Matching via Vector Field Consensus

Cited by: 512
Authors
Ma, Jiayi [1 ]
Zhao, Ji [2 ]
Tian, Jinwen [1 ]
Yuille, Alan L. [3 ]
Tu, Zhuowen [4 ]
Affiliations
[1] Huazhong Univ Sci & Technol, Sch Automat, Natl Key Lab Sci & Technol Multispectral Informat, Wuhan 430074, Hubei, Peoples R China
[2] Carnegie Mellon Univ, Inst Robot, Pittsburgh, PA 15213 USA
[3] Univ Calif Los Angeles, Dept Stat, Los Angeles, CA 90095 USA
[4] Univ Calif San Diego, Dept Cognit Sci, La Jolla, CA 92697 USA
Funding
US National Science Foundation; National Natural Science Foundation of China;
Keywords
Point correspondence; outlier removal; matching; regularization; STATISTICAL PHYSICS; OBJECT RECOGNITION; TRANSFORMATION; APPROXIMATION; MODELS;
DOI
10.1109/TIP.2014.2307478
CLC classification
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In this paper, we propose an efficient algorithm, called vector field consensus, for establishing robust point correspondences between two sets of points. Our algorithm starts by creating a set of putative correspondences which can contain a very large number of false correspondences, or outliers, in addition to a limited number of true correspondences (inliers). Next, we solve for correspondence by interpolating a vector field between the two point sets, which involves estimating a consensus of inlier points whose matching follows a nonparametric geometrical constraint. We formulate this as a maximum a posteriori (MAP) estimation of a Bayesian model with hidden/latent variables indicating whether matches in the putative set are outliers or inliers. We impose nonparametric geometrical constraints on the correspondence, as a prior distribution, using Tikhonov regularizers in a reproducing kernel Hilbert space. MAP estimation is performed by the EM algorithm, which, by also estimating the variance of the prior model (initialized to a large value), is able to obtain good estimates very quickly (e.g., avoiding many of the local minima inherent in this formulation). We illustrate this method on data sets in 2D and 3D and demonstrate that it is robust to a very large number of outliers (even up to 90%). We also show that, in the special case where there is an underlying parametric geometrical model (e.g., the epipolar line constraint), we obtain better results than standard alternatives like RANSAC when a large number of outliers are present. This suggests a two-stage strategy, where we use our nonparametric model to reduce the size of the putative set and then apply a parametric variant of our approach to estimate the geometric parameters. Our algorithm is computationally efficient and we provide code for others to use it. In addition, our approach is general and can be applied to other problems, such as learning with a badly corrupted training data set.
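The EM procedure outlined in the abstract can be sketched roughly as follows. This is a minimal illustrative sketch, not the authors' released code: the function name, parameter values, and the choice to fit the displacement field with a Gaussian kernel mixture of a Gaussian inlier model and a uniform outlier model are assumptions based on the abstract's description.

```python
import numpy as np

def vfc_sketch(X, Y, beta=0.5, lam=3.0, gamma=0.9, n_iter=50):
    """Hypothetical sketch of a vector-field-consensus EM loop.

    X, Y: (N, D) arrays of putative correspondences x_i -> y_i.
    Fits a displacement field f(x) = sum_j K(x, x_j) C_j in an RKHS
    with a Gaussian kernel, alternating between inlier posteriors
    (E-step) and a Tikhonov-regularized solve for C (M-step).
    Returns the posterior inlier probability of each match.
    """
    N, D = X.shape
    V = Y - X                                    # displacement targets
    d2 = np.sum((X[:, None] - X[None, :]) ** 2, axis=-1)
    K = np.exp(-beta * d2)                       # Gaussian Gram matrix
    C = np.zeros((N, D))                         # f(x_i) = (K @ C)_i
    a = np.prod(np.ptp(V, axis=0)) + 1e-12       # uniform outlier density is 1/a
    sigma2 = np.sum(V ** 2) / (N * D)            # large initial variance
    for _ in range(n_iter):
        # E-step: posterior that each putative match is an inlier,
        # under a Gaussian-inlier / uniform-outlier mixture.
        r2 = np.sum((V - K @ C) ** 2, axis=1)
        inlier = gamma * np.exp(-r2 / (2 * sigma2)) / (2 * np.pi * sigma2) ** (D / 2)
        p = inlier / (inlier + (1 - gamma) / a)
        # M-step: weighted ridge solve (Tikhonov regularizer in the RKHS),
        # then re-estimate the noise variance and the inlier ratio.
        C = np.linalg.solve(np.diag(p) @ K + lam * sigma2 * np.eye(N),
                            p[:, None] * V)
        r2 = np.sum((V - K @ C) ** 2, axis=1)
        sigma2 = max(p @ r2 / (D * p.sum()), 1e-8)
        gamma = float(np.clip(p.mean(), 0.05, 0.95))
    return p
```

A correspondence can then be accepted as an inlier when its posterior exceeds a threshold (e.g., 0.5); starting sigma2 large, as the abstract notes, keeps early E-steps soft and helps the EM iteration avoid poor local minima.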
Pages: 1706-1721
Page count: 16