Method for 3-D Scene Reconstruction Using Fused LiDAR and Imagery From a Texel Camera

Cited by: 17
Authors
Bybee, Taylor C. [1 ]
Budge, Scott E. [1 ]
Affiliations
[1] Utah State Univ, Dept Elect & Comp Engn, Ctr Adv Imaging LADAR, Logan, UT 84322 USA
Source
IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING | 2019, Vol. 57, No. 11
Keywords
Laser radar; Cameras; Image reconstruction; Aircraft navigation; Aircraft; Surface reconstruction; Bundle adjustment; image registration; LiDAR; multisensor systems; photogrammetry; remote sensing; DENSITY LIDAR; DATA FUSION; REGISTRATION; PHOTOGRAMMETRY; ORIENTATION;
DOI
10.1109/TGRS.2019.2923551
Chinese Library Classification
P3 [Geophysics]; P59 [Geochemistry]
Subject Classification Codes
0708; 070902
Abstract
Reconstructing a 3-D scene from aerial sensor data to create a textured digital surface model (TDSM), consisting of a LiDAR point cloud and an overlaid image, is valuable in many applications, including agriculture, military operations, surveying, and natural disaster response. When LiDAR is collected from an aircraft, the accuracy of the navigation system must exceed the accuracy of the LiDAR to properly reference returns in 3-D space. Precision navigation systems can be expensive and often require full-scale aircraft to house them. Synchronizing the LiDAR sensor and a camera, using a texel camera calibration, provides additional information that reduces the need for precision navigation equipment. This paper describes a bundle adjustment technique for aerial texel images that allows relatively low-accuracy navigation systems to be used with low-cost LiDAR and camera data to form higher-fidelity terrain models. The bundle adjustment objective function utilizes matching image points, measured LiDAR distances, and the texel camera calibration, and does not require overlapping LiDAR scans or ground control points. The utility of this method is demonstrated using a simulated texel camera and unmanned aerial system (UAS) flight data created from aerial photographs and elevation data. A small UAS is chosen as the target vehicle because of its relatively inexpensive hardware and operating costs, illustrating the power of this method in accurately referencing the LiDAR and camera data. In the 3-D reconstruction, the 1-σ accuracy between LiDAR measurements across the scene is on the order of the digital camera pixel size.
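As a rough illustration of the kind of joint objective the abstract describes, the sketch below combines whitened image reprojection residuals with LiDAR range residuals in a single nonlinear least-squares problem. The function names, pose parameterization, and noise values are assumptions for illustration only, not the paper's actual formulation.

```python
# Hedged sketch (not the authors' formulation): a texel-style bundle
# adjustment cost that jointly penalizes image reprojection error and
# LiDAR range error, so camera poses and 3-D points can be refined
# together even with a low-accuracy navigation prior. All names and
# noise values below are illustrative assumptions.
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation


def texel_residuals(params, obs_2d, obs_range, cam_idx, pt_idx,
                    n_cams, K, sigma_px=1.0, sigma_rng=0.05):
    """Stacked residuals: reprojection (pixels) and LiDAR range (meters),
    each whitened by an assumed measurement standard deviation."""
    # Unpack per-frame poses (rotation vector + translation) and 3-D points.
    cams = params[:n_cams * 6].reshape(n_cams, 6)
    pts = params[n_cams * 6:].reshape(-1, 3)

    # Transform each observed point into its camera frame.
    rot = Rotation.from_rotvec(cams[cam_idx, :3])
    p_cam = rot.apply(pts[pt_idx]) + cams[cam_idx, 3:]

    # Pinhole projection with intrinsics K (from the texel calibration).
    uv = (K @ p_cam.T).T
    uv = uv[:, :2] / uv[:, 2:3]

    r_img = ((uv - obs_2d) / sigma_px).ravel()
    r_rng = (np.linalg.norm(p_cam, axis=1) - obs_range) / sigma_rng
    return np.concatenate([r_img, r_rng])


# Usage: refine an initial guess x0 (poses from the low-cost navigation
# solution, points from raw LiDAR returns) by nonlinear least squares:
# result = least_squares(texel_residuals, x0,
#                        args=(obs_2d, obs_range, cam_idx, pt_idx, n_cams, K))
```

Because each texel observation carries both an image point and a co-calibrated range, the range term constrains the scale and depth that reprojection error alone cannot, which is what lets a low-accuracy navigation solution serve as the initial guess.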
Pages: 8879-8889
Page count: 11