Fusing LiDAR and Photogrammetry for Accurate 3D Data: A Hybrid Approach

Cited by: 0
Authors
Maskeliunas, Rytis [1 ]
Maqsood, Sarmad [1 ]
Vaskevicius, Mantas [2 ]
Gelsvartas, Julius [2 ]
Affiliations
[1] Kaunas Univ Technol, Fac Informat, Ctr Real Time Comp Syst, LT-51386 Kaunas, Lithuania
[2] Matomai UAB, LT-51423 Kaunas, Lithuania
Keywords
LiDAR; photogrammetry; point cloud fusion; machine learning; augmented reality; point cloud; visualization
DOI
10.3390/rs17030443
Chinese Library Classification (CLC)
X [Environmental Science, Safety Science];
Discipline Code
08 ; 0830 ;
Abstract
The fusion of LiDAR and photogrammetry point clouds is a necessary advancement in 3D modeling, enabling more comprehensive and accurate representations of physical environments. The main contribution of this paper is an innovative fusion system that combines classical algorithms, such as Structure from Motion (SfM), with machine learning techniques, such as Coherent Point Drift (CPD) and Feature-Metric Registration (FMR), to improve point cloud alignment and fusion. Experimental results on a custom dataset of real-world scenes demonstrate that the hybrid fusion method achieves an average measurement error of less than 5% for small reconstructed objects, and less than 2% deviation from true size for large objects. The fusion process significantly improved structural continuity, reducing artifacts such as edge misalignments. A k-nearest neighbors (kNN) analysis showed high reconstruction accuracy for the hybrid approach, demonstrating that the hybrid fusion system, particularly when machine learning-based refinement is combined with traditional alignment methods, provides a notable advance in both geometric accuracy and computational efficiency for real-time 3D modeling applications.
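The paper's actual pipeline (SfM plus CPD and FMR refinement) is not reproduced here. As an illustrative stand-in only, the sketch below rigidly aligns two corresponded point clouds with the classical Kabsch algorithm and scores the result with a k=1 nearest-neighbour distance, in the spirit of the kNN accuracy check mentioned in the abstract. The function names (`kabsch_align`, `nn_error`) and the synthetic clouds are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def kabsch_align(source, target):
    """Rigid alignment (rotation R, translation t) of two corresponded
    N x 3 point sets via the Kabsch algorithm (SVD of the cross-covariance)."""
    src_c = source - source.mean(axis=0)
    tgt_c = target - target.mean(axis=0)
    H = src_c.T @ tgt_c
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = target.mean(axis=0) - R @ source.mean(axis=0)
    return R, t

def nn_error(cloud_a, cloud_b):
    """Mean k=1 nearest-neighbour distance from cloud_a to cloud_b --
    a simple proxy for a kNN reconstruction-accuracy check."""
    diffs = cloud_a[:, None, :] - cloud_b[None, :, :]   # shape (N, M, 3)
    dists = np.linalg.norm(diffs, axis=2)
    return dists.min(axis=1).mean()

# Toy example: a rotated and translated copy of a "LiDAR" cloud stands in
# for the photogrammetry cloud; alignment should make them coincide.
rng = np.random.default_rng(0)
lidar = rng.normal(size=(100, 3))
angle = 0.3
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([0.5, -0.2, 1.0])
photo = lidar @ R_true.T + t_true

R, t = kabsch_align(lidar, photo)
fused = lidar @ R.T + t
print(nn_error(fused, photo))   # ≈ 0 once the clouds are aligned
```

In a real fusion pipeline this closed-form step would only handle the rigid part; the paper's CPD and FMR stages additionally handle unknown correspondences and non-rigid residual deformation, which this sketch does not.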
Pages: 27
Related Papers (50 total)
  • [1] 3D City Models completion by Fusing Lidar and Image Data
    Grammatikopoulos, L.
    Kalisperakis, I.
    Petsa, E.
    Stentoumis, C.
    VIDEOMETRICS, RANGE IMAGING, AND APPLICATIONS XIII, 2015, 9528
  • [2] Fusing semantic labeled camera images and 3D LiDAR data for the detection of urban curbs
    Goga, Selma Evelyn Catalina
    Nedevschi, Sergiu
    2018 IEEE 14TH INTERNATIONAL CONFERENCE ON INTELLIGENT COMPUTER COMMUNICATION AND PROCESSING (ICCP), 2018, : 301 - 308
  • [3] Construction of 3D Environment Models by Fusing Ground and Aerial Lidar Point Cloud Data
    Langerwisch, Marco
    Kraemer, Marc Steven
    Kuhnert, Klaus-Dieter
    Wagner, Bernardo
    INTELLIGENT AUTONOMOUS SYSTEMS 13, 2016, 302 : 473 - 485
  • [4] 3D Lidar Data Segmentation Using a Sequential Hybrid Method
    Tuncer, Mehmet Ali Cagri
    Schulz, Dirk
    INFORMATICS IN CONTROL, AUTOMATION AND ROBOTICS, ICINCO 2017, 2020, 495 : 513 - 535
  • [5] Fusing Large Volumes of Range and Image Data for Accurate Description of Realistic 3D Scenes
    Chan, Yuk Hin
    Delmas, Patrice
    Gimel'farb, Georgy
    Valkenburg, Robert
    ADVANCED CONCEPTS FOR INTELLIGENT VISION SYSTEMS, PT I, 2010, 6474 : 332 - +
  • [6] Automated mapping of rock discontinuities in 3D lidar and photogrammetry models
    Lato, Matthew J.
    Voge, Malte
    INTERNATIONAL JOURNAL OF ROCK MECHANICS AND MINING SCIENCES, 2012, 54 : 150 - 158
  • [7] Comparative Analysis of LiDAR and Photogrammetry for 3D Crime Scene Reconstruction
    Sheshtar, Fatemah M.
    Alhatlani, Wajd M.
    Moulden, Michael
    Kim, Jong Hyuk
    APPLIED SCIENCES-BASEL, 2025, 15 (03):
  • [8] Design of 3D reconstruction system on quadrotor Fusing LiDAR and camera
    Zhao, Pinjie
    Li, Rui
    Shi, Yingjing
    He, Liang
    PROCEEDINGS OF THE 39TH CHINESE CONTROL CONFERENCE, 2020, : 3984 - 3989
  • [9] 3D Scene Reconstruction Using Omnidirectional Vision and LiDAR: A Hybrid Approach
    Vlaminck, Michiel
    Luong, Hiep
    Goeman, Werner
    Philips, Wilfried
    SENSORS, 2016, 16 (11)
  • [10] Creation of accurate 3D models of harbor porpoises (Phocoena phocoena) using 3D photogrammetry
    Irschick, Duncan J.
    Martin, Johnson
    Siebert, Ursula
    Kristensen, Jakob H.
    Madsen, Peter T.
    Christiansen, Fredrik
    MARINE MAMMAL SCIENCE, 2021, 37 (02) : 482 - 491