LOW COST AND EFFICIENT 3D INDOOR MAPPING USING MULTIPLE CONSUMER RGB-D CAMERAS

Cited by: 4
Authors
Chen, C. [1 ]
Yang, B. S. [1 ]
Song, S. [1 ]
Affiliations
[1] Wuhan Univ, State Key Laboratory of Information Engineering in Surveying, Mapping and Remote Sensing (LIESMARS), 129 Luoyu Rd, Wuhan, Peoples R China
Source
XXIII ISPRS CONGRESS, COMMISSION I | 2016, Vol. 41, Issue B1
Keywords
Indoor Mapping; RGB-D Camera; Kinect; Calibration; Visual Odometry
DOI
10.5194/isprsarchives-XLI-B1-169-2016
Chinese Library Classification (CLC)
TP7 [Remote Sensing Technology]
Subject classification codes
081102; 0816; 081602; 083002; 1404
Abstract
Driven by the miniaturization and light weight of positioning and remote sensing sensors, as well as the urgent need to fuse indoor and outdoor maps for next-generation navigation, 3D indoor mapping from mobile scanning is a hot research and application topic. Point clouds with auxiliary data such as colour and infrared images derived from a 3D indoor mobile mapping suite can be used in a variety of novel applications, including indoor scene visualization, automated floorplan generation, gaming, reverse engineering, navigation and simulation. State-of-the-art 3D indoor mapping systems equipped with multiple laser scanners produce accurate point clouds of building interiors containing billions of points. However, these laser-scanner-based systems are mostly expensive and not portable. Low-cost consumer RGB-D cameras provide an alternative way to address the core challenge of indoor mapping: capturing the detailed underlying geometry of building interiors. Nevertheless, RGB-D cameras have a very limited field of view, resulting in low efficiency during data collection and incomplete datasets that miss major building structures (e.g. ceilings, walls). Attempting to capture a complete scene without data gaps using a single RGB-D camera is not technically sound because of the large amount of human labour and the many position parameters that must be solved. To find an efficient and low-cost approach to 3D indoor mapping, this paper presents an indoor mapping suite prototype built upon a novel calibration method that calibrates the internal and external parameters of multiple RGB-D cameras. Three Kinect sensors are mounted on a rig with different viewing directions to form a large combined field of view. The calibration procedure is threefold: (1) the internal parameters of the colour and infrared camera inside each Kinect are calibrated using a chessboard pattern; (2) the external parameters between the colour and infrared camera inside each Kinect are calibrated using a chessboard pattern; (3) the external parameters between the Kinects are first calculated using a pre-set calibration field and then refined by an iterative closest point (ICP) algorithm. Experiments are carried out to validate the proposed method on RGB-D datasets collected by the indoor mapping suite prototype. The effectiveness and accuracy of the proposed method are evaluated by comparing the point clouds derived from the prototype with ultra-high-density ground truth data collected by a commercial terrestrial laser scanner. The overall analysis of the results shows that the proposed method achieves seamless integration of point clouds from different RGB-D cameras collected at 30 frames per second.
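As a rough illustration of how the three calibration stages described in the abstract could be implemented with off-the-shelf tools, the sketch below uses OpenCV for the two chessboard stages and Open3D for the ICP refinement. It is a minimal outline under assumed conditions: the pattern size, square size, and all function and variable names are illustrative placeholders, not taken from the paper.

```python
# Sketch of the three-stage multi-Kinect calibration (illustrative only).
# Assumptions: synchronised 8-bit colour/infrared images with a visible
# chessboard, and an initial Kinect-to-Kinect pose from the calibration field.
import cv2
import numpy as np
import open3d as o3d

PATTERN = (9, 6)        # inner chessboard corners (assumed)
SQUARE_SIZE = 0.025     # chessboard square edge in metres (assumed)

def chessboard_points(images):
    """Detect chessboard corners and build the matching 3D object points."""
    objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE_SIZE
    obj_pts, img_pts, size = [], [], None
    for img in images:
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY) if img.ndim == 3 else img
        found, corners = cv2.findChessboardCorners(gray, PATTERN)
        if found:
            obj_pts.append(objp)
            img_pts.append(corners)
            size = gray.shape[::-1]
    return obj_pts, img_pts, size

# Stage 1: internal parameters of one camera (colour or infrared) of a Kinect.
def calibrate_intrinsics(images):
    obj_pts, img_pts, size = chessboard_points(images)
    _, K, dist, _, _ = cv2.calibrateCamera(obj_pts, img_pts, size, None, None)
    return K, dist

# Stage 2: external parameters (R, t) between the colour and infrared camera
# of one Kinect, with the intrinsics from stage 1 held fixed. Assumes the
# board is detected in every synchronised colour/infrared pair.
def calibrate_colour_to_ir(colour_imgs, ir_imgs, K_c, d_c, K_ir, d_ir):
    obj_pts, pts_c, size = chessboard_points(colour_imgs)
    _, pts_ir, _ = chessboard_points(ir_imgs)
    _, _, _, _, _, R, T, _, _ = cv2.stereoCalibrate(
        obj_pts, pts_c, pts_ir, K_c, d_c, K_ir, d_ir, size,
        flags=cv2.CALIB_FIX_INTRINSIC)
    return R, T

# Stage 3: refine the initial Kinect-to-Kinect pose (from the calibration
# field) with point-to-point ICP on overlapping point clouds.
def refine_kinect_to_kinect(src_cloud, dst_cloud, T_init, max_dist=0.05):
    result = o3d.pipelines.registration.registration_icp(
        src_cloud, dst_cloud, max_dist, T_init,
        o3d.pipelines.registration.TransformationEstimationPointToPoint())
    return result.transformation  # 4x4 transform mapping src into dst frame
```

Chaining the per-Kinect colour-to-infrared extrinsics from stage 2 with the refined Kinect-to-Kinect transforms from stage 3 would place every depth frame in a common rig frame, which is what allows the three 30 fps streams to be fused into a single point cloud.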
Pages: 169-174
Page count: 6