Lidar/MEMS IMU/Odometer tightly integrated navigation algorithm

Cited by: 0
Authors
Zhang F. [1 ]
Wang K. [1 ]
Liao W. [2 ]
Sun C. [1 ]
Affiliations
[1] School of Marine Science and Technology, Northwestern Polytechnical University, Xi'an
[2] Chinese Flight Test Establishment, Xi'an
Keywords
Laser SLAM; Multi-sensor; Point cloud distortion correction; Tight coupling
DOI
10.19650/j.cnki.cjsi.J2209599
Abstract
To improve the robustness and stability of robot navigation systems in unknown and complex environments, a Lidar/MEMS IMU/Odometer tightly integrated navigation algorithm is proposed. First, the algorithm corrects the point cloud distortion caused by lidar motion through MEMS IMU/Odometer pre-integration, improving the efficiency of feature matching between two point cloud frames. Second, the pre-integrated robot pose is linearly interpolated according to the timestamp to obtain a rough pose change between the two frames; this rough pose change serves as the initial value of the optimization iteration, reducing the number of iterations required. Then, MEMS IMU/Odometer motion constraints are added to the back-end optimization, and joint multi-sensor optimization is used to improve the positioning accuracy of the robot. Finally, a simulation experiment is carried out on a dataset, and indoor and outdoor open-loop and closed-loop experiments are conducted with a four-wheeled cart. Experiments show that the average outdoor open-loop positioning error of this algorithm is reduced by 51.01% and 24.75% compared with the traditional algorithms ALOAM and LEGO-LOAM, respectively, and that the algorithm maintains high accuracy under intense motion such as cornering. © 2022, Science Press. All rights reserved.
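The timestamp-based interpolation step described in the abstract can be sketched as follows. This is a minimal 2D illustration under assumed conventions, not the paper's implementation: the pose representation (x, y, yaw), the function names, and the choice to deskew points into the scan-end frame are all hypothetical simplifications of the 3D pipeline the authors describe.

```python
import math

def interpolate_pose(t, t0, pose0, t1, pose1):
    """Linearly interpolate a 2D pose (x, y, yaw) at time t between t0 and t1."""
    s = (t - t0) / (t1 - t0)
    x = (1 - s) * pose0[0] + s * pose1[0]
    y = (1 - s) * pose0[1] + s * pose1[1]
    # interpolate the heading along the shortest angular difference
    dyaw = math.atan2(math.sin(pose1[2] - pose0[2]), math.cos(pose1[2] - pose0[2]))
    yaw = pose0[2] + s * dyaw
    return (x, y, yaw)

def deskew_point(p, t, t0, pose0, t1, pose1):
    """Re-express a lidar point p captured at time t in the scan-end frame (pose1)."""
    x, y, yaw = interpolate_pose(t, t0, pose0, t1, pose1)
    # world coordinates of the point as seen from the interpolated pose
    wx = x + math.cos(yaw) * p[0] - math.sin(yaw) * p[1]
    wy = y + math.sin(yaw) * p[0] + math.cos(yaw) * p[1]
    # transform back into the scan-end frame
    dx, dy = wx - pose1[0], wy - pose1[1]
    c, s = math.cos(-pose1[2]), math.sin(-pose1[2])
    return (c * dx - s * dy, s * dx + c * dy)
```

For example, a point 1 m ahead of the robot at scan start ends up at the origin of the scan-end frame if the robot travels exactly 1 m forward during the sweep; applying this per-point correction removes the motion-induced distortion before feature matching.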
Pages: 139-148
Number of pages: 9