Enhancing Direct Georeferencing Using Real-Time Kinematic UAVs and Structure from Motion-Based Photogrammetry for Large-Scale Infrastructure

Cited by: 1
Authors
Han, Soohee [1]
Han, Dongyeob [2]
Affiliations
[1] Kyungil Univ, Dept Geoinformat Engn, Gyongsan si 38428, South Korea
[2] Chonnam Natl Univ, Dept Civil Engn, Gwangju 61186, South Korea
Keywords
RTK; UAV; constraint equation; SfM; oblique image; RTK-UAV
DOI
10.3390/drones8120736
Chinese Library Classification (CLC)
TP7 [Remote Sensing Technology]
Discipline Classification Codes
081102; 0816; 081602; 083002; 1404
Abstract
The growing demand for high-accuracy mapping and 3D modeling using unmanned aerial vehicles (UAVs) has accelerated advancements in flight dynamics, positioning accuracy, and imaging technology. Structure from motion (SfM), a computer vision-based approach, is increasingly replacing traditional photogrammetry by facilitating the automation of processes such as aerial triangulation (AT), terrain modeling, and orthomosaic generation. This study examines methods to enhance the accuracy of SfM-based AT using real-time kinematic (RTK) UAV imagery, focusing on large-scale infrastructure applications, including a dam and its entire basin. The target area, consisting primarily of homogeneous water surfaces, poses considerable challenges for feature point extraction and image matching, which are crucial for effective SfM. To overcome these challenges and improve AT accuracy, a constraint equation was applied, incorporating weighted 3D coordinates derived from RTK UAV data. Furthermore, oblique images were combined with nadir images to stabilize AT, and confidence-based filtering was applied to point clouds to enhance geometric quality. The results indicate that assigning appropriate weights to the 3D coordinates and incorporating oblique imagery significantly improve AT accuracy. This approach represents a promising advancement for RTK UAV-based AT in SfM-challenging, large-scale environments, supporting more efficient and precise mapping applications.
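As an illustrative sketch rather than the paper's exact formulation, the weighted constraint on RTK-derived camera positions described in the abstract can be written as an additional term in the bundle-adjustment (AT) cost. The symbols below are assumptions introduced for illustration only: x_{ij} denotes the image observation of tie point X_j in image i, \pi(\cdot) the projection function, R_i and C_i the estimated camera rotation and position, \tilde{C}_i the RTK-measured camera position, and W_i a weight matrix built from assumed horizontal and vertical RTK precisions \sigma_{xy} and \sigma_z:

E\bigl(\{R_i, C_i\}, \{X_j\}\bigr) = \sum_{i,j} \bigl\| x_{ij} - \pi(R_i, C_i, X_j) \bigr\|^2 + \sum_i (C_i - \tilde{C}_i)^\top W_i (C_i - \tilde{C}_i), \qquad W_i = \mathrm{diag}\bigl(\sigma_{xy}^{-2}, \sigma_{xy}^{-2}, \sigma_z^{-2}\bigr).

Under this reading, larger weights (smaller assumed \sigma) pull the adjusted camera centers toward the RTK positions, which is consistent with the abstract's finding that the choice of weights governs the resulting AT accuracy.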
Pages: 16