Corner Matching Refinement for Monocular Pose Estimation

Cited by: 0
Authors
Gamage, Dinesh [1 ]
Drummond, Tom [1 ]
Affiliations
[1] Monash Univ, Clayton, Vic 3800, Australia
Source
PROCEEDINGS OF THE BRITISH MACHINE VISION CONFERENCE 2012 | 2012
Keywords
PHASE; ALGORITHMS; MODEL; DEPTH
DOI
10.5244/C.26.38
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Many tasks in computer vision rely on accurate detection and matching of visual landmarks (e.g. image corners) between two images. In particular, for the calculation of epipolar geometry from a minimal set of five correspondences the spatial accuracy of matched landmarks is critical because the result is very sensitive to errors. The most common way of improving the accuracy is to calculate a sub-pixel location independently for each landmark in the hope that this reduces the re-projection error of the point in space to which they refer. This paper presents a method for refining the coordinates of correspondences directly. Thus, given some coordinates in the first image, our goal is to maximise the accuracy of the estimated coordinates in the second image corresponding to the same real-world point, without being too concerned about which real-world point is being matched. We show how this can be achieved as a frequency-domain optimisation between two image patches to refine the correspondence by estimating affine parameters. We select the correct frequency range for optimisation by identifying a direct relationship between the Gabor-phase-based approach and the frequency response of a patch. Further, we show how parametric estimation can be made accurate by operating in the frequency domain. Finally, we present experiments which demonstrate the accuracy of this approach, its robustness to changes in scale and orientation, and its superior performance by comparison to other sub-pixel methods.
Pages: 11
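The abstract describes a frequency-domain optimisation that refines a correspondence by estimating affine parameters between two patches, with the frequency band for the optimisation chosen from a Gabor-phase analysis. As a rough illustration of the general idea only, the Python sketch below estimates a pure sub-pixel translation between two matched patches from the phase of their cross-power spectrum over a fixed mid-frequency band; the function name, the band limits f_lo/f_hi and the translation-only model are illustrative assumptions, not the authors' algorithm.

import numpy as np

def refine_shift(patch_a, patch_b, f_lo=0.05, f_hi=0.35):
    """Estimate the sub-pixel (dy, dx) shift taking patch_a onto patch_b.

    Illustrative sketch only: translation-only model, ad-hoc frequency band.
    """
    h, w = patch_a.shape
    # Window both patches to limit spectral leakage at the patch boundary.
    win = np.outer(np.hanning(h), np.hanning(w))
    cross = np.fft.fft2(patch_a * win) * np.conj(np.fft.fft2(patch_b * win))

    # Frequency grid in cycles/pixel; keep an assumed mid-frequency band
    # (a crude stand-in for the paper's Gabor-phase-derived band selection).
    fy, fx = np.meshgrid(np.fft.fftfreq(h), np.fft.fftfreq(w), indexing="ij")
    band = (np.hypot(fy, fx) > f_lo) & (np.hypot(fy, fx) < f_hi)

    # For a pure translation d = (dy, dx), angle(cross) ~= 2*pi*(fy*dy + fx*dx)
    # provided the shift is small enough that the phase does not wrap.
    # Solve for d by magnitude-weighted linear least squares over the band.
    phase = np.angle(cross[band])
    wgt = np.abs(cross[band])
    A = 2.0 * np.pi * np.column_stack([fy[band], fx[band]])
    dy, dx = np.linalg.lstsq(A * wgt[:, None], phase * wgt, rcond=None)[0]
    return dy, dx

In use, one would cut patches around a tentatively matched corner pair, add the recovered (dy, dx) to the coordinate in the second image and iterate; the method in the paper additionally recovers the remaining affine parameters (rotation, scale, shear), which a translation-only sketch like this cannot.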