Visual-based localization methods for unmanned aerial vehicles in landing operation on maritime vessel

Cited by: 0
Authors
Nguyen, Tien-Thanh [1]
Hamesse, Charles [2,3]
Dutrannois, Thomas [1]
Halleux, Timothy [1]
De Cubber, Geert [1]
Haelterman, Rob [2]
Janssens, Bart [1]
Affiliations
[1] Royal Mil Acad, Dept Mech, Brussels, Belgium
[2] Royal Mil Acad, Dept Math, Brussels, Belgium
[3] Univ Ghent, imec, IPI, URC, Ghent, Belgium
Source
ACTA IMEKO | 2024 / Vol. 13 / No. 4
Keywords
UAV; maritime; synthetic data; detection; tracking; SLAM; VERSATILE; ONLINE
DOI
10.21014/actaimeko.v13i4.1575
Chinese Library Classification
TH7 [Instruments and Meters]
Discipline Classification Codes
0804; 080401; 081102
Abstract
Unmanned Aerial Vehicles (UAVs) have become increasingly important in maritime operations. However, accurate localization of these UAVs in maritime environments, especially during landing operations on maritime vessels, remains a challenge, particularly in GNSS-denied areas. This paper proposes two visual-based localization methods for UAVs covering two different phases of the landing operation on maritime vessels. The first method estimates the UAV's position with respect to the vessel during the approach phase. It involves a visual UAV detection and tracking approach using the YOLO detector and the OceanPlus tracker trained on a custom dataset. The UAV's position with respect to the vessel is estimated using stereo triangulation. The proposed method achieves accurate positioning with errors below 10 cm during landing phases in a simulated environment. The second method is used for the final landing phase. We utilize a visual Simultaneous Localization and Mapping (SLAM) algorithm, ORB-SLAM3, for real-time motion estimation of a UAV with respect to its confined landing area on a maritime platform. ORB-SLAM3 was benchmarked against multiple state-of-the-art visual SLAM and Visual Odometry (VO) algorithms and evaluated on a simulated landing scenario of a UAV at 16 m height with a downward-facing camera. The results demonstrate sufficient speed and accuracy for the landing task. These methods provide a promising solution for precise and reliable localization of UAVs in different phases of the landing operation on a maritime vessel, especially in GNSS-denied environments. The dataset and source code can be accessed at: https://gitlab.cylab.be/t.nguyen/uav-visual-localization.
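For illustration, the following is a minimal sketch of the stereo-triangulation step used in the approach phase: given the pixel centre of the UAV detected in the left and right images of a calibrated, rectified stereo rig on the vessel (e.g. by the YOLO/OceanPlus pipeline), its 3D position in the left-camera frame is recovered with OpenCV. The intrinsics, baseline and pixel coordinates below are hypothetical placeholders, not values from the paper.

import numpy as np
import cv2

fx, fy, cx, cy = 800.0, 800.0, 640.0, 360.0   # hypothetical rectified camera intrinsics [px]
baseline = 0.50                               # hypothetical stereo baseline [m]

K = np.array([[fx, 0.0, cx],
              [0.0, fy, cy],
              [0.0, 0.0, 1.0]])
# 3x4 projection matrices of the rectified pair: left camera at the origin,
# right camera translated by the baseline along the x axis.
P_left = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P_right = K @ np.hstack([np.eye(3), np.array([[-baseline], [0.0], [0.0]])])

def triangulate_uav(uv_left, uv_right):
    """Triangulate one UAV detection (pixel centres in the left/right image)
    into a 3D point expressed in the left-camera frame."""
    pts_l = np.asarray(uv_left, dtype=float).reshape(2, 1)
    pts_r = np.asarray(uv_right, dtype=float).reshape(2, 1)
    X_h = cv2.triangulatePoints(P_left, P_right, pts_l, pts_r)  # 4x1 homogeneous point
    return (X_h[:3] / X_h[3]).ravel()                           # [X, Y, Z] in metres

# Example: bounding-box centres with a 40 px disparity -> UAV roughly 10 m away.
print(triangulate_uav((700.0, 300.0), (660.0, 300.0)))

In the paper's approach-phase pipeline the pixel centres would be supplied by the detector and tracker running on the vessel's stereo cameras; here they are passed in directly for brevity.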
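ORB-SLAM3 itself is a full C++ SLAM system, so the sketch below only illustrates the kind of feature-based frame-to-frame motion estimation that underlies it and the VO baselines: ORB features are matched between two consecutive downward-facing frames and the relative rotation and (up-to-scale) translation direction are recovered with OpenCV. The camera intrinsics and the synthetic "deck" images are invented for the example; this is not the authors' ORB-SLAM3 configuration.

import numpy as np
import cv2

# Hypothetical pinhole intrinsics of the downward-facing camera.
K = np.array([[600.0, 0.0, 320.0],
              [0.0, 600.0, 240.0],
              [0.0, 0.0, 1.0]])

def relative_pose(frame_prev, frame_curr, K):
    """Estimate rotation and unit-norm translation between two grayscale frames
    from matched ORB features (a single visual-odometry step)."""
    orb = cv2.ORB_create(nfeatures=1000)
    kp1, des1 = orb.detectAndCompute(frame_prev, None)
    kp2, des2 = orb.detectAndCompute(frame_curr, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])
    E, inlier_mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=inlier_mask)
    return R, t

# Synthetic test: a textured "deck" image and a laterally shifted copy of it.
rng = np.random.default_rng(0)
deck = np.full((480, 640), 40, dtype=np.uint8)
for _ in range(200):                        # random bright rectangles as deck texture
    x, y = int(rng.integers(0, 600)), int(rng.integers(0, 440))
    w, h = int(rng.integers(10, 40)), int(rng.integers(10, 40))
    cv2.rectangle(deck, (x, y), (x + w, y + h), int(rng.integers(90, 255)), -1)
shifted = np.roll(deck, shift=8, axis=1)    # simulate lateral camera motion over the deck
R, t = relative_pose(deck, shifted, K)
print("estimated translation direction:", t.ravel())

A full SLAM system such as ORB-SLAM3 adds map management, relocalization and loop closing on top of this kind of two-view estimate, which is what makes it suitable for the confined final-landing scenario evaluated in the paper.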
Pages: 10-13
Number of pages: 4