Corner Matching Refinement for Monocular Pose Estimation

Times Cited: 0
Authors
Gamage, Dinesh [1]
Drummond, Tom [1]
Affiliations
[1] Monash Univ, Clayton, Vic 3800, Australia
Source
PROCEEDINGS OF THE BRITISH MACHINE VISION CONFERENCE 2012 | 2012
Keywords
PHASE; ALGORITHMS; MODEL; DEPTH;
DOI
10.5244/C.26.38
CLC Classification
TP18 [Artificial intelligence theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Many tasks in computer vision rely on accurate detection and matching of visual landmarks (e.g. image corners) between two images. In particular, for the calculation of epipolar geometry from a minimal set of five correspondences, the spatial accuracy of matched landmarks is critical because the result is very sensitive to errors. The most common way of improving accuracy is to calculate a sub-pixel location independently for each landmark, in the hope that this reduces the re-projection error of the point in space to which the landmarks refer. This paper presents a method for refining the coordinates of correspondences directly. Thus, given some coordinates in the first image, our goal is to maximise the accuracy of the estimate of the corresponding coordinates in the second image for the same real-world point, without being too concerned about which real-world point is being matched. We show how this can be achieved as a frequency-domain optimisation between two image patches, refining the correspondence by estimating affine parameters. We select the correct frequency range for optimisation by identifying a direct relationship between the Gabor phase based approach and the frequency response of a patch. Further, we show how parametric estimation can be made accurate by operating in the frequency domain. Finally, we present experiments which demonstrate the accuracy of this approach, its robustness to changes in scale and orientation, and its superior performance in comparison to other sub-pixel methods.
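The abstract describes refining a correspondence by optimising in the frequency domain between two image patches. The paper's own method estimates affine parameters via a Gabor-phase formulation, which is not reproduced here; as a simpler, related illustration of frequency-domain patch matching, the sketch below uses standard phase correlation (cross-power spectrum whose magnitude is normalised away, leaving only phase) to recover a pure translation between two patches. The function name and test patches are illustrative, not from the paper.

```python
import numpy as np

def phase_correlation_shift(patch_a, patch_b):
    """Estimate the integer translation taking patch_a to patch_b.

    Normalising the cross-power spectrum discards magnitude and keeps
    only phase; its inverse FFT peaks at the relative translation.
    (The paper goes further, optimising full affine parameters and a
    selected frequency band; this shows only the translation case.)
    """
    Fa = np.fft.fft2(patch_a)
    Fb = np.fft.fft2(patch_b)
    cross = Fb * np.conj(Fa)
    cross /= np.abs(cross) + 1e-12       # keep phase only
    corr = np.fft.ifft2(cross).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap indices above the half-period to signed shifts.
    return [p - s if p > s // 2 else p for p, s in zip(peak, corr.shape)]

# Usage: circularly shift a synthetic patch by (3, 5) and recover it.
rng = np.random.default_rng(0)
a = rng.random((64, 64))
b = np.roll(a, shift=(3, 5), axis=(0, 1))
print(phase_correlation_shift(a, b))  # → [3, 5]
```

Sub-pixel accuracy, as targeted by the paper, would additionally require interpolating around the correlation peak or fitting the phase slope directly, rather than taking the integer argmax.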
Pages: 11
Related Papers
50 records
  • [31] Joint Albedo Estimation and Pose Tracking from Video
    Taheri, Sima
    Sankaranarayanan, Aswin C.
    Chellappa, Rama
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2013, 35 (07) : 1674 - 1689
  • [32] Head Pose Estimation Based on Multivariate Label Distribution
    Geng, Xin
    Qian, Xin
    Huo, Zengwei
    Zhang, Yu
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2022, 44 (04) : 1974 - 1991
  • [33] Camera marker networks for articulated machine pose estimation
    Feng, Chen
    Kamat, Vineet R.
    Cai, Hubo
    AUTOMATION IN CONSTRUCTION, 2018, 96 : 148 - 160
  • [34] Probabilistic Distance Estimation for Vehicle Tracking Application in Monocular Vision
    Lessmann, Stephanie
    Meuter, Mirko
    Mueller, Dennis
    Pauli, Josef
    2016 IEEE INTELLIGENT VEHICLES SYMPOSIUM (IV), 2016, : 1199 - 1204
  • [35] Robust Automatic Monocular Vehicle Speed Estimation for Traffic Surveillance
    Revaud, Jerome
    Humenberger, Martin
    2021 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2021), 2021, : 4531 - 4541
  • [36] Overview of 3D Human Pose Estimation
    Lin, Jianchu
    Li, Shuang
    Qin, Hong
    Wang, Hongchang
    Cui, Ning
    Jiang, Qian
    Jian, Haifang
    Wang, Gongming
    CMES-COMPUTER MODELING IN ENGINEERING & SCIENCES, 2023, 134 (03): : 1621 - 1651
  • [37] Human Pose Estimation and Object Interaction for Sports Behaviour
    Arif, Ayesha
    Ghadi, Yazeed Yasin
    Alarfaj, Mohammed
    Jalal, Ahmad
    Kamal, Shaharyar
    Kim, Dong-Seong
    CMC-COMPUTERS MATERIALS & CONTINUA, 2022, 72 (01): : 1 - 18
  • [38] Efficient decoupled pose estimation from a set of points
    Tahri, Omar
    Araujo, Helder
    Mezouar, Youcef
    Chaumette, Francois
    2013 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2013, : 1608 - 1613
  • [39] Efficient Iterative Pose Estimation using an Invariant to Rotations
    Tahri, Omar
    Araujo, Helder
    Mezouar, Youcef
    Chaumette, Francois
    IEEE TRANSACTIONS ON CYBERNETICS, 2014, 44 (02) : 199 - 207
  • [40] User Pose Estimation based on multiple depth sensors
    Baek, Seongmin
    Kim, Myunggyu
    SIGGRAPH ASIA 2017 POSTERS (SA'17), 2017,