FUSION OF OPTICAL AND TERRESTRIAL LASER SCANNER DATA

Cited by: 0
Authors
Li-Chee-Ming, Julien [1]
Armenakis, Costas [1]
Affiliations
[1] York Univ, Geomat Engn, GeoICT Lab, Dept Earth & Space Sci & Engn, Toronto, ON M3J 2R7, Canada
Source
2010 CANADIAN GEOMATICS CONFERENCE AND SYMPOSIUM OF COMMISSION I, ISPRS CONVERGENCE IN GEOMATICS - SHAPING CANADA'S COMPETITIVE LANDSCAPE | 2010 / Vol. 38
Keywords
Terrestrial Laser Scanning; Photogrammetry; Sensor Registration; Point Texture Mapping
DOI
Not available
Chinese Library Classification (CLC)
P9 [Physical Geography]
Subject Classification Codes
0705; 070501
Abstract
Optical imagery and range data can be registered to create photo-realistic scene models via texture mapping. This paper presents an alternative approach in which true-colour (RGB) point clouds are generated by automatically fusing a close-range optical (RGB) image, acquired with an uncalibrated digital camera, with the corresponding high-density 3D lidar point cloud collected with a terrestrial laser scanner (TLS). The alignment of the optical pixel colour values and the lidar point cloud is obtained by estimating the position and orientation of the camera with respect to the lidar point cloud reference system. To perform this sensor co-registration, an automated corner feature extraction algorithm followed by area-based image matching is applied between the optical image and the lidar intensity image to establish point correspondences. The matching process is based solely on point matches and does not use external control or calibration patterns. The 3D lidar points corresponding to the extracted lidar intensity image corner points are then retrieved from the point cloud. Because these 3D lidar points correspond to the extracted optical image corner points, a bundle self-calibration adjustment with additional parameters, based on the extended collinearity equations, is applied to estimate the interior and exterior orientation of the camera. The RANSAC robust estimator is used to reduce the influence of outliers in the estimation of the camera parameters. Having established the mathematical relationship between image space and the lidar points, a photo-realistic 3D model is generated: through reverse mapping, each point in the lidar point cloud is assigned the RGB value of the image pixel onto which it is projected. Experiments are performed on typical urban scenes, particularly building facades. The feasibility and potential of estimating the co-registration parameters using a TLS are evaluated in terms of the accuracy of the results. The true calibration parameters, provided by the TLS manufacturer, are used to validate the registration parameters. The technique has reliably aligned a camera with the TLS geometry for the simultaneous generation of point-based photo-realistic 3D models.
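The reverse-mapping step described in the abstract amounts to projecting every lidar point through the estimated camera model (collinearity equations with additional distortion parameters) and sampling the RGB value of the pixel it lands on. The following NumPy sketch illustrates that step only; it is not the authors' implementation. The interior and exterior orientation that the paper recovers with the bundle self-calibration adjustment and RANSAC are supplied here as plain arguments, a single-term radial distortion model and an omega-phi-kappa rotation convention are assumed, and all names (rotation_matrix, colourize_point_cloud, pixel_size, k1, etc.) are hypothetical.

import numpy as np

def rotation_matrix(omega, phi, kappa):
    # One common omega-phi-kappa parameterization of the object-to-camera
    # rotation (an assumption; the paper does not state its convention).
    Rx = np.array([[1.0, 0.0, 0.0],
                   [0.0, np.cos(omega), -np.sin(omega)],
                   [0.0, np.sin(omega),  np.cos(omega)]])
    Ry = np.array([[ np.cos(phi), 0.0, np.sin(phi)],
                   [0.0, 1.0, 0.0],
                   [-np.sin(phi), 0.0, np.cos(phi)]])
    Rz = np.array([[np.cos(kappa), -np.sin(kappa), 0.0],
                   [np.sin(kappa),  np.cos(kappa), 0.0],
                   [0.0, 0.0, 1.0]])
    return Rz @ Ry @ Rx

def colourize_point_cloud(points, image, c, x0, y0, k1, R, X0, pixel_size):
    # Reverse mapping: project every lidar point into the image with the
    # collinearity equations and pick up the RGB value of the pixel it hits.
    #   points     : (N, 3) lidar coordinates in the TLS frame
    #   image      : (H, W, 3) uint8 optical image
    #   c, x0, y0  : principal distance and principal point (mm)
    #   k1         : single radial-distortion coefficient (assumed model)
    #   R, X0      : object-to-camera rotation and camera centre (TLS frame)
    #   pixel_size : physical pixel size (mm)
    # Returns an (N, 3) uint8 array of RGB values; points that fall outside
    # the image or behind the camera keep RGB = (0, 0, 0).
    H, W = image.shape[:2]
    cam = (points - X0) @ R.T                 # camera-frame coordinates
    in_front = cam[:, 2] < 0.0                # camera looks along -z (photogrammetric convention)
    xc, yc, zc = cam[in_front, 0], cam[in_front, 1], cam[in_front, 2]
    x = -c * xc / zc                          # collinearity equations (image coordinates in mm)
    y = -c * yc / zc
    r2 = x * x + y * y
    x = x0 + x * (1.0 + k1 * r2)              # principal-point offset and radial correction
    y = y0 + y * (1.0 + k1 * r2)
    col = np.rint(W / 2.0 + x / pixel_size).astype(int)   # mm -> pixel indices,
    row = np.rint(H / 2.0 - y / pixel_size).astype(int)   # origin at image centre, y up
    ok = (col >= 0) & (col < W) & (row >= 0) & (row < H)
    rgb = np.zeros((points.shape[0], 3), dtype=np.uint8)
    rgb[np.flatnonzero(in_front)[ok]] = image[row[ok], col[ok]]
    return rgb

# Hypothetical usage: colour a synthetic 5 m x 5 m facade patch placed 10 m
# in front of a camera sitting at the TLS origin and looking along +Z.
rng = np.random.default_rng(0)
pts = np.column_stack([rng.uniform(-2.5, 2.5, 1000),
                       rng.uniform(-2.5, 2.5, 1000),
                       np.full(1000, 10.0)])
img = rng.integers(0, 256, size=(1200, 1600, 3), dtype=np.uint8)
R = rotation_matrix(np.pi, 0.0, 0.0)          # flips object +Z onto the camera -z axis
colours = colourize_point_cloud(pts, img, c=10.0, x0=0.0, y0=0.0, k1=-5e-5,
                                R=R, X0=np.zeros(3), pixel_size=0.005)
coloured_cloud = np.hstack([pts, colours.astype(float)])  # XYZRGB rows

In the workflow the abstract describes, the same projection would be evaluated with the interior and exterior orientation estimated from the RANSAC-filtered corner correspondences, so every TLS point receives the colour of the image pixel it projects onto.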
Pages: 6