CorrI2P: Deep Image-to-Point Cloud Registration via Dense Correspondence

Cited by: 26
Authors
Ren, Siyu [1 ]
Zeng, Yiming [1 ]
Hou, Junhui [1 ]
Chen, Xiaodong [2 ]
Affiliations
[1] City Univ Hong Kong, City Univ Hong Kong Shenzhen Res Inst, Dept Comp Sci, Hong Kong 518057, Peoples R China
[2] Tianjin Univ, Sch Precis Instrument & Optoelect Engn, Tianjin 300072, Peoples R China
Keywords
Point cloud compression; Feature extraction; Cameras; Three-dimensional displays; Detectors; Feeds; Visualization; Point cloud; registration; cross-modality; correspondence; deep learning
D O I
10.1109/TCSVT.2022.3208859
CLC classification
TM [Electrical engineering]; TN [Electronics and communication technology]
Discipline codes
0808; 0809
Abstract
Motivated by the intuition that the critical step in localizing a 2D image within a 3D point cloud is establishing 2D-3D correspondence between them, we propose CorrI2P, the first feature-based dense correspondence framework for the challenging problem of 2D image-to-3D point cloud registration. CorrI2P comprises three modules: feature embedding, symmetric overlapping region detection, and pose estimation from the established correspondence. Specifically, given a 2D image and a 3D point cloud, we first map them into high-dimensional feature spaces and feed the resulting features into a symmetric overlapping region detector to determine the region where the image and point cloud overlap. We then use the features of the overlapping regions to establish dense 2D-3D correspondence, on which EPnP within RANSAC is performed to estimate the camera pose, i.e., the rotation and translation. Experimental results on the KITTI and nuScenes datasets show that CorrI2P significantly outperforms state-of-the-art image-to-point cloud registration methods. The code will be publicly available at https://github.com/rsy6318/CorrI2P.
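The final stage described in the abstract turns dense 2D-3D correspondences into a camera pose with a PnP solver. The sketch below is not from the paper: where CorrI2P runs EPnP inside RANSAC on learned correspondences, this minimal numpy-only illustration uses the Direct Linear Transform (DLT) on synthetic, outlier-free correspondences as a stand-in solver, just to show how 2D-3D pairs determine a projection matrix.

```python
import numpy as np

def dlt_pnp(pts3d, pts2d):
    """Estimate a 3x4 projection matrix P from >= 6 exact 2D-3D
    correspondences via the Direct Linear Transform. Each pair
    (X, x) contributes two homogeneous linear constraints on P;
    the solution is the right null vector of the stacked system."""
    A = []
    for (X, Y, Z), (u, v) in zip(pts3d, pts2d):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    _, _, Vt = np.linalg.svd(np.asarray(A))
    return Vt[-1].reshape(3, 4)  # defined up to scale

def project(pmat, pts3d):
    """Project 3D points with a 3x4 matrix (homogeneous divide)."""
    h = np.c_[pts3d, np.ones(len(pts3d))] @ pmat.T
    return h[:, :2] / h[:, 2:3]

# Synthetic ground truth: intrinsics K, identity rotation, small translation.
K = np.array([[500., 0., 320.], [0., 500., 240.], [0., 0., 1.]])
Rt = np.c_[np.eye(3), np.array([0.1, -0.2, 0.5])]
P_gt = K @ Rt
pts3d = np.random.default_rng(0).uniform(-1, 1, (20, 3)) + [0., 0., 5.]
pts2d = project(P_gt, pts3d)

P_est = dlt_pnp(pts3d, pts2d)
err = np.abs(project(P_est, pts3d) - pts2d).max()
print(err)  # reprojection error; should be near zero for exact pairs
```

In practice the learned correspondences are noisy, which is why the paper wraps the PnP solver in RANSAC: random minimal subsets are solved repeatedly, and the pose with the largest inlier set (lowest reprojection error) is kept.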
Pages: 1198-1208
Page count: 11