Enhanced RGB-D Mapping Method for Detailed 3D Indoor and Outdoor Modeling

Times Cited: 26
Authors
Tang, Shengjun [1 ,2 ,3 ,4 ,5 ]
Zhu, Qing [1 ,2 ,3 ,4 ]
Chen, Wu [5 ]
Darwish, Walid [5 ]
Wu, Bo [5 ]
Hu, Han [3 ]
Chen, Min [3 ]
Affiliations
[1] Wuhan Univ, State Key Lab Informat Engn Surveying Mapping & R, 129 Luoyu Rd, Wuhan 430079, Peoples R China
[2] State Prov Joint Engn Lab Spatial Informat Techno, Chengdu 610031, Peoples R China
[3] Southwest Jiaotong Univ, Fac Geosci & Environm Engn, Chengdu 610031, Peoples R China
[4] Collaborat Innovat Ctr Geospatial Technol, 129 Luoyu Rd, Wuhan 430079, Peoples R China
[5] Hong Kong Polytech Univ, Dept Land Surveying & Geoinformat, Hong Kong 999077, Hong Kong, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
indoor modeling; RGB-D camera; depth image; camera pose; registration; distance;
DOI
10.3390/s16101589
CLC Number
O65 [Analytical Chemistry];
Subject Classification Codes
070302; 081704;
Abstract
RGB-D sensors (sensors combining an RGB camera and a depth camera) are novel sensing systems that capture RGB images along with pixel-wise depth information. Although they are widely used in various applications, RGB-D sensors have significant drawbacks with respect to 3D dense mapping: a limited measurement range (e.g., within 3 m) and depth errors that increase with distance from the sensor. In this paper, we present a novel approach that geometrically integrates the depth scene and the RGB scene to enlarge the measurement distance of RGB-D sensors and to enrich the details of the model generated from the depth images. First, a precise calibration procedure for RGB-D sensors is introduced. In addition to the internal and external parameters of both the IR camera and the RGB camera, the relative pose between the two cameras is also calibrated. Second, to ensure accurate poses for the RGB images, a refined false-feature-match rejection method is introduced that combines the depth information with the initial camera poses between frames of the RGB-D sensor. A global optimization model is then used to improve the accuracy of the camera poses, reducing the inconsistencies between depth frames in advance. To eliminate geometric inconsistencies between the RGB scene and the depth scene, the scale ambiguity encountered during pose estimation from RGB image sequences is resolved by integrating the depth and visual information, and a robust rigid-transformation recovery method is developed to register the RGB scene to the depth scene. The benefits of the proposed joint optimization method are first evaluated on publicly available benchmark datasets collected with a Kinect sensor. The proposed method is then examined on two datasets collected in outdoor and indoor environments. The experimental results demonstrate the feasibility and robustness of the proposed method.
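The rigid-transformation recovery step mentioned in the abstract registers the up-to-scale RGB (structure-from-motion) scene to the metric depth scene. The record does not include the authors' implementation; the following is a minimal sketch, assuming 3D point correspondences between the two scenes are already available, of one standard way to recover such an alignment: Umeyama's closed-form similarity estimation (scale, rotation, translation) wrapped in a simple RANSAC loop for robustness. All function names, thresholds, and iteration counts below are illustrative assumptions, not the paper's code.

import numpy as np


def umeyama_alignment(src, dst):
    """Closed-form similarity transform (scale s, rotation R, translation t)
    minimizing ||dst - (s * R @ src + t)|| over corresponding 3D points.
    src, dst: (N, 3) arrays of corresponding points."""
    mu_src, mu_dst = src.mean(axis=0), dst.mean(axis=0)
    src_c, dst_c = src - mu_src, dst - mu_dst
    cov = dst_c.T @ src_c / src.shape[0]          # 3x3 cross-covariance
    U, D, Vt = np.linalg.svd(cov)
    S = np.eye(3)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:
        S[2, 2] = -1.0                            # enforce a proper rotation (det = +1)
    R = U @ S @ Vt
    var_src = (src_c ** 2).sum() / src.shape[0]   # variance of the source points
    s = np.trace(np.diag(D) @ S) / var_src
    t = mu_dst - s * R @ mu_src
    return s, R, t


def ransac_similarity(src, dst, iters=500, thresh=0.05, seed=0):
    """Robust similarity estimation from noisy correspondences; thresh is an
    inlier distance in the metric (depth) frame, here assumed to be metres."""
    rng = np.random.default_rng(seed)
    best_inliers = np.zeros(len(src), dtype=bool)
    for _ in range(iters):
        idx = rng.choice(len(src), size=4, replace=False)   # small random sample
        s, R, t = umeyama_alignment(src[idx], dst[idx])
        err = np.linalg.norm(dst - (s * (src @ R.T) + t), axis=1)
        inliers = err < thresh
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    if best_inliers.sum() < 3:        # fall back to all points if too few inliers found
        best_inliers[:] = True
    # Refit on the consensus set for the final estimate
    return umeyama_alignment(src[best_inliers], dst[best_inliers]), best_inliers

In practice the correspondences would come from image features that have valid depth in both scenes, and the inlier threshold would be chosen relative to the depth sensor's noise at the working distance.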
Pages: 22