Real-Scale 3-D Reconstruction With Monocular Zoom Technology

Cited by: 0
Authors:
Song, Jinao [1 ]
Li, Jie [1 ]
Fan, Hao [1 ]
Qi, Lin [1 ]
Zhang, Shu [1 ]
Chen, Yong [1 ]
Dong, Junyu [1 ]
Institutions:
[1] Ocean Univ China, Dept Informat Sci & Technol, Qingdao 266000, Peoples R China
Funding:
National Natural Science Foundation of China;
Keywords:
Three-dimensional displays; Accuracy; Structure from motion; Design methodology; Cameras; Image restoration; Image reconstruction; Monocular zooming; optical flow; real-scale 3-D reconstruction;
DOI:
10.1109/TIM.2024.3497052
Chinese Library Classification:
TM (Electrical Engineering); TN (Electronics & Communication Technology);
Discipline codes:
0808; 0809;
Abstract:
We propose a method that uses monocular zoom technology for real-scale 3-D reconstruction of a scene. To reconstruct the scene, we capture a sequence of zoomed-in and zoomed-out images. First, we estimate the zoomed-in camera parameters from the known zoomed-out camera parameters, which avoids calibrating the camera twice. Then, we use a structure-from-motion (SfM) method (COLMAP) to recover free-scale translations among these images. Because we have pairs of zoom frames of the same scene, we can compute the true scale of the scene from the ratio between the free-scale translation of a zoom pair and the difference between the zoomed-out and zoomed-in focal lengths. Finally, we use RAFT-Stereo to compute the depth of the scene. In detail, we select two adjacent images taken at the same focal length, apply stereo rectification to them, and remove the non-co-visible area of the rectified images. In this way, we obtain more accurate matching between the images and, in turn, a dense real-scale 3-D reconstruction. Experimental results demonstrate that our method achieves good performance on monocular 3-D reconstruction with real scale.
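The scale-recovery step described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes SfM reports a free-scale translation vector for a zoom pair and that the focal-length difference (in metric units) is the physical quantity that fixes the global scale, as the abstract's ratio suggests. The function name `recover_scale` and the example focal lengths are hypothetical.

```python
import numpy as np

def recover_scale(t_sfm, f_out_mm, f_in_mm):
    """Map SfM units to metres using a zoom pair.

    t_sfm    : free-scale translation from SfM between the zoomed-out
               and zoomed-in frames of one zoom pair (3-vector)
    f_out_mm : zoomed-out focal length, known from calibration (mm)
    f_in_mm  : zoomed-in focal length, estimated from the zoomed-out
               parameters (mm)
    """
    # Known metric quantity: the focal-length change, converted to metres.
    delta_f_m = abs(f_in_mm - f_out_mm) / 1000.0
    # Free-scale quantity: the magnitude of the SfM translation.
    t_norm = np.linalg.norm(t_sfm)
    if t_norm == 0.0:
        raise ValueError("zoom pair produced no SfM translation")
    # Ratio of the two fixes the global scale factor.
    return delta_f_m / t_norm

# Example: SfM reports a unit-length translation for a 24 mm -> 70 mm zoom.
scale = recover_scale(np.array([0.0, 0.0, 1.0]), 24.0, 70.0)
# Every SfM translation and 3-D point is then multiplied by `scale`.
```

In a full pipeline this factor would be applied to all COLMAP poses and points before the RAFT-Stereo depth is fused, so that the reconstruction comes out in metric units.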
Pages: 8