R-LVIO: Resilient LiDAR-Visual-Inertial Odometry for UAVs in GNSS-denied Environment

Cited: 0
Authors
Zhang, Bing [1 ]
Shao, Xiangyu [1 ]
Wang, Yankun [1 ]
Sun, Guanghui [1 ]
Yao, Weiran [1 ]
Affiliations
[1] Harbin Inst Technol, Dept Control Sci & Engn, Harbin 150001, Peoples R China
Funding
National Natural Science Foundation of China; China Postdoctoral Science Foundation
Keywords
Multi-sensor fusion; LiDAR-visual-inertial odometry; structure quantification; point-to-surface alignment; REAL-TIME; ROBUST; LOAM;
DOI
10.3390/drones8090487
CLC Classification Number
TP7 [Remote Sensing Technology]
Subject Classification Codes
081102 ; 0816 ; 081602 ; 083002 ; 1404 ;
Abstract
In low-altitude, GNSS-denied scenarios, unmanned aerial vehicles (UAVs) rely on sensor fusion for self-localization. This article presents a resilient multi-sensor fusion localization system that integrates light detection and ranging (LiDAR), cameras, and inertial measurement units (IMUs) to achieve state estimation for UAVs. To address challenging environments, especially unstructured ones, IMU predictions are used to compensate for pose estimation in the visual and LiDAR components. Specifically, the accuracy of IMU predictions is enhanced by increasing the correction frequency of the IMU bias through data integration from the LiDAR and visual modules. To reduce the impact of random errors and measurement noise in LiDAR points on visual depth measurement, cross-validation of visual feature depth is performed using reprojection error to eliminate outliers. Additionally, a structure monitor is introduced to switch operation modes in hybrid point cloud registration, ensuring accurate state estimation in both structured and unstructured environments. In unstructured scenes, a geometric primitive capable of representing irregular planes is employed for point-to-surface registration, along with a novel pose-solving method to estimate the UAV's pose. The proposed system is validated on both private and public UAV datasets, where it outperforms state-of-the-art algorithms by at least 12.6%.
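The depth cross-validation step described in the abstract can be illustrated with a minimal sketch: a feature observed in two camera frames is assigned a LiDAR-derived depth, back-projected into 3D, reprojected into the second frame with the relative pose, and rejected as an outlier if the reprojection error exceeds a pixel threshold. This is a generic implementation of the idea, not the paper's actual code; the function names, the threshold value, and the simple pinhole model (no distortion) are assumptions for illustration.

```python
import numpy as np

def reprojection_error(p1, p2, depth, K, R, t):
    """Back-project pixel p1 (frame 1) with a LiDAR-derived depth,
    transform it into frame 2 with relative pose (R, t), reproject,
    and return the pixel distance to the matched observation p2."""
    uv1 = np.array([p1[0], p1[1], 1.0])
    X1 = depth * (np.linalg.inv(K) @ uv1)   # 3D point in frame-1 camera coords
    X2 = R @ X1 + t                         # same point in frame-2 camera coords
    uv2 = K @ (X2 / X2[2])                  # pinhole projection into frame 2
    return np.linalg.norm(uv2[:2] - np.array(p2))

def filter_depths(matches, depths, K, R, t, thresh_px=2.0):
    """Keep only (feature pair, depth) candidates whose LiDAR depth is
    positive and consistent with the two views (small reprojection error)."""
    keep = []
    for (p1, p2), d in zip(matches, depths):
        if d > 0 and reprojection_error(p1, p2, d, K, R, t) < thresh_px:
            keep.append(((p1, p2), d))
    return keep
```

For example, with a focal length of 500 px and a 0.1 m lateral baseline, a feature at 5 m depth on the optical axis reprojects 10 px away in the second view; a noisy LiDAR depth of 2 m for the same match would predict a 25 px shift and be rejected.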
Pages: 20
Related Papers
14 records total
  • [1] FT-LVIO: Fully Tightly coupled LiDAR-Visual-Inertial odometry
    Zhang, Zhuo
    Yao, Zheng
    Lu, Mingquan
    IET RADAR SONAR AND NAVIGATION, 2023, 17 (05) : 759 - 771
  • [2] UAV Navigation With Monocular Visual Inertial Odometry Under GNSS-Denied Environment
    Luo, Haolong
    Li, Guangyun
    Zou, Danping
    Li, Kailin
    Li, Xueqiang
    Yang, Zidi
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2023, 61
  • [3] LVIO-Fusion: Tightly-Coupled LiDAR-Visual-Inertial Odometry and Mapping in Degenerate Environments
    Zhang, Hongkai
    Du, Liang
    Bao, Sheng
    Yuan, Jianjun
    Ma, Shugen
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2024, 9 (04) : 3783 - 3790
  • [4] LiDAR-Visual-Inertial Odometry Based on Optimized Visual Point-Line Features
    He, Xuan
    Gao, Wang
    Sheng, Chuanzhen
    Zhang, Ziteng
    Pan, Shuguo
    Duan, Lijin
    Zhang, Hui
    Lu, Xinyu
    REMOTE SENSING, 2022, 14 (03)
  • [5] F-LVINS: Flexible Lidar-Visual-Inertial Odometry Systems
    Tang, Xiang-Shi
    Cheng, Teng-Hu
    IEEE ACCESS, 2023, 11 : 104028 - 104037
  • [6] Radar and Visual Odometry Integrated System Aided Navigation for UAVS in GNSS Denied Environment
    Mostafa, Mostafa
    Zahran, Shady
    Moussa, Adel
    El-Sheimy, Naser
    Sesay, Abu
    SENSORS, 2018, 18 (09)
  • [7] IMU Augment Tightly Coupled Lidar-Visual-Inertial Odometry for Agricultural Environments
    Hoang, Quoc Hung
    Kim, Gon-Woo
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2024, 9 (10) : 8483 - 8490
  • [8] Efficient and adaptive lidar-visual-inertial odometry for agricultural unmanned ground vehicle
    Zhao, Zixu
    Zhang, Yucheng
    Long, Long
    Lu, Zaiwang
    Shi, Jinglin
    INTERNATIONAL JOURNAL OF ADVANCED ROBOTIC SYSTEMS, 2022, 19 (02)
  • [9] GNSS-Denied Semi-Direct Visual Navigation for Autonomous UAVs Aided by PI-Inspired Inertial Priors
    Gallo, Eduardo
    Barrientos, Antonio
    AEROSPACE, 2023, 10 (03)
  • [10] Dynam-LVIO: A Dynamic-Object-Aware LiDAR Visual Inertial Odometry in Dynamic Urban Environments
    Shi, Jian
    Wang, Wei
    Qi, Mingyang
    Li, Xin
    Yan, Ye
    IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 2024, 73 : 1 - 19