Map Construction and Positioning Method for LiDAR SLAM-Based Navigation of an Agricultural Field Inspection Robot

Cited by: 3
Authors
Qu, Jiwei [1 ]
Qiu, Zhinuo [1 ]
Li, Lanyu [1 ]
Guo, Kangquan [2 ]
Li, Dan [3 ]
Affiliations
[1] Yangzhou Univ, Sch Mech Engn, Yangzhou 225127, Peoples R China
[2] Northwest A&F Univ, Coll Mech & Elect Engn, Xianyang 712100, Peoples R China
[3] Yangzhou Polytech Inst, Coll Intelligent Mfg, Yangzhou 225127, Peoples R China
Source
AGRONOMY-BASEL | 2024, Vol. 14, Issue 10
Keywords
field robots; navigation; SLAM; unmanned operations; automation; testing; LOCALIZATION; ALGORITHM;
D O I
10.3390/agronomy14102365
Chinese Library Classification
S3 [Agronomy];
Discipline Code
0901 ;
Abstract
In agricultural field inspection robots, constructing accurate environmental maps and achieving precise localization are essential for effective Light Detection And Ranging (LiDAR) Simultaneous Localization And Mapping (SLAM) navigation. However, navigation in occluded environments presents challenges such as mapping distortion and substantial cumulative error. Although current filter-based and graph-optimization-based algorithms perform well, they are highly complex. This paper investigates precise mapping and localization methods for robots, enabling accurate LiDAR SLAM navigation in agricultural environments characterized by occlusions. First, a LiDAR SLAM point cloud mapping scheme based on the LiDAR Odometry And Mapping (LOAM) framework is proposed, tailored to the operational requirements of the robot. Then, the GNU Image Manipulation Program (GIMP) is employed for map optimization. This approach simplifies the map optimization process for autonomous navigation systems and aids in converting the map to a costmap. Finally, the Adaptive Monte Carlo Localization (AMCL) method is implemented for the robot's positioning, using sensor data from the robot. Experimental results show that during outdoor navigation tests, when the robot operates at a speed of 1.6 m/s, the average error between the mapped values and actual measurements is 0.205 m. The results demonstrate that the proposed method effectively prevents navigation mapping distortion and enables reliable robot positioning in experimental settings.
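The AMCL positioning step named in the abstract is a particle-filter localizer. As a rough illustration of the idea only (the paper uses the full ROS AMCL package; the 1-D world, wall position, and noise values below are invented for the sketch), the propagate/weight/resample cycle can be written as:

```python
import math
import random

# Minimal 1-D Monte Carlo localization sketch. A robot moves along a line
# with noisy odometry and measures its range to a wall at x = 10.
random.seed(42)

WALL_X = 10.0
N_PARTICLES = 500

def move(particles, delta, noise=0.1):
    """Propagate each particle by the commanded motion plus Gaussian noise."""
    return [x + delta + random.gauss(0.0, noise) for x in particles]

def weight(particles, measured_range, noise=0.2):
    """Weight each particle by the likelihood of the range measurement."""
    w = []
    for x in particles:
        expected = WALL_X - x          # range the particle predicts
        err = measured_range - expected
        w.append(math.exp(-err * err / (2 * noise * noise)))
    s = sum(w) or 1.0
    return [wi / s for wi in w]

def resample(particles, weights):
    """Draw a new particle set with probability proportional to weight."""
    return random.choices(particles, weights=weights, k=len(particles))

# True robot starts at x = 0 and takes five unit steps.
particles = [random.uniform(0.0, 10.0) for _ in range(N_PARTICLES)]
true_x = 0.0
for _ in range(5):
    true_x += 1.0
    particles = move(particles, 1.0)
    z = (WALL_X - true_x) + random.gauss(0.0, 0.2)  # noisy range reading
    particles = resample(particles, weight(particles, z))

estimate = sum(particles) / len(particles)
print(round(estimate, 2))  # converges near true_x = 5.0
```

The adaptive part of AMCL additionally varies the particle count with estimation uncertainty (KLD sampling), which this fixed-size sketch omits.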
Pages: 19
References
35 in total
[11]   Remote-Sensing Data and Deep-Learning Techniques in Crop Mapping and Yield Prediction: A Systematic Review [J].
Joshi, Abhasha ;
Pradhan, Biswajeet ;
Gite, Shilpa ;
Chakraborty, Subrata .
REMOTE SENSING, 2023, 15 (08)
[12]   Experimental 2D extended Kalman filter sensor fusion for low-cost GNSS/IMU/Odometers precise positioning system [J].
Kaczmarek, Adrian ;
Rohm, Witold ;
Klingbeil, Lasse ;
Tchorzewski, Janusz .
MEASUREMENT, 2022, 193
[13]   Comparative Study on Simulated Outdoor Navigation for Agricultural Robots [J].
Khanzada, Feeza Khan ;
Delavari, Elahe ;
Jeong, Woojin ;
Cho, Young Seek ;
Kwon, Jaerock .
SENSORS, 2024, 24 (08)
[14]   P-AgSLAM: In-Row and Under-Canopy SLAM for Agricultural Monitoring in Cornfields [J].
Kim, Kitae ;
Deb, Aarya ;
Cappelleri, David J. .
IEEE ROBOTICS AND AUTOMATION LETTERS, 2024, 9 (06) :4982-4989
[15]  
Li Chenyang, 2021, Transactions of the Chinese Society of Agricultural Engineering, V37, P16
[16]   Performance evaluation of 2D LiDAR SLAM algorithms in simulated orchard environments [J].
Li, Qiujie ;
Zhu, Hongyi .
COMPUTERS AND ELECTRONICS IN AGRICULTURE, 2024, 221
[17]  
Liu Chengliang, 2022, Transactions of the Chinese Society for Agricultural Machinery, V53, P1
[18]   Autonomous navigation system for greenhouse tomato picking robots based on laser SLAM [J].
Liu, Kenan ;
Yu, Jingrong ;
Huang, Zhaowei ;
Liu, Li ;
Shi, Yinggang .
ALEXANDRIA ENGINEERING JOURNAL, 2024, 100 :208-219
[19]   Precision Inter-Row Relative Positioning Method by Using 3D LiDAR in Planted Forests and Orchards [J].
Liu, Limin ;
Ji, Dong ;
Zeng, Fandi ;
Zhao, Zhihuan ;
Wang, Shubo .
AGRONOMY-BASEL, 2024, 14 (06)
[20]   Laser 3D tightly coupled mapping method based on visual information [J].
Liu, Sixing ;
Chai, Yan ;
Yuan, Rui ;
Miao, Hong .
INDUSTRIAL ROBOT-THE INTERNATIONAL JOURNAL OF ROBOTICS RESEARCH AND APPLICATION, 2023, 50 (06) :917-929