Robotic Fertilisation Using Localisation Systems Based on Point Clouds in Strip-Cropping Fields

Cited by: 12
Authors
Cruz Ulloa, Christyan [1 ]
Krus, Anne [2 ]
Barrientos, Antonio [1 ]
Del Cerro, Jaime [1 ]
Valero, Constantino [2 ]
Affiliations
[1] Univ Politecn Madrid, Ctr Automat & Robot, CSIC, Madrid 28006, Spain
[2] Univ Politecn Madrid, Dept Ingn Agroforestal, ETSI Agronom Alimentaria & Biosistemas, Madrid 28040, Spain
Source
AGRONOMY-BASEL | 2021, Vol. 11, Iss. 1
Keywords
organic farming; ROS; strip cropping; robotic systems; point cloud localisation; lidar; REGISTRATION; ALGORITHM;
DOI
10.3390/agronomy11010011
CLC Classification Number
S3 [Agronomy (Agricultural Science)];
Subject Classification Code
0901;
Abstract
The use of robotic systems in organic farming has taken on a leading role in recent years; the Sureveg CORE Organic Cofund ERA-Net project seeks to evaluate the benefits of strip-cropping for producing organic vegetables. Its objectives include the development of a robotic tool that automates the fertilisation process and allows treatment of individual plants. In organic production, the slower nutrient release of the fertilisers used poses additional difficulties, as deficiencies detected too late can no longer be corrected. To improve detection, and to offset the additional labour stemming from the strip-cropping configuration, an integrated robotic tool is proposed that detects deficiencies of individual crops and reacts on a single-crop basis. For this proof of concept, one of the main objectives of this work is to implement a robust, point-cloud-based localisation method within the vegetative environment, through the generation of a general point-cloud map (G-PC) and local point-cloud maps (L-PC) of a crop row. The plants' geometric characteristics are extracted from the G-PC and serve as the reference frame in which the robot's position is defined. Through the processing of real-time lidar data, the L-PC is then built and compared with this previously derived reference. Both subsystems are integrated in ROS (Robot Operating System), alongside motion planning and an inverse-kinematics CCD (Cyclic Coordinate Descent) solver, among other components. Tests were first performed in a simulated crop-row environment developed in Gazebo, followed by actual measurements in a strip-cropping field. During real-time data acquisition, the localisation error is reduced from 13 mm to 11 mm within the first 120 cm of measurement. The geometric characteristics encountered in real time were found to coincide with those in the G-PC to an extent of 98.6%.
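The core of the localisation method described above is the registration of the real-time L-PC against the previously built G-PC. As a rough illustration of that step only (the paper's actual pipeline runs inside ROS and is not reproduced in this record), the sketch below aligns a local lidar cloud with a reference map using point-to-point ICP; the use of the Open3D library, the function name localise_lpc_against_gpc, and the parameter values are assumptions made for the example.

```python
# Illustrative sketch (not the authors' implementation): align a local
# point-cloud map (L-PC) captured by lidar against a pre-built general
# map (G-PC) with point-to-point ICP, using Open3D as a stand-in for
# whatever registration stack the ROS pipeline actually relies on.
import numpy as np
import open3d as o3d

def localise_lpc_against_gpc(lpc_points, gpc_points, voxel=0.05, max_dist=0.2):
    """Estimate the rigid transform that places the local map in the
    general map's frame; lpc_points/gpc_points are (N, 3) arrays in metres."""
    lpc = o3d.geometry.PointCloud(
        o3d.utility.Vector3dVector(np.asarray(lpc_points, dtype=np.float64)))
    gpc = o3d.geometry.PointCloud(
        o3d.utility.Vector3dVector(np.asarray(gpc_points, dtype=np.float64)))

    # Down-sample both clouds so ICP converges faster on field-scale data.
    lpc_ds = lpc.voxel_down_sample(voxel)
    gpc_ds = gpc.voxel_down_sample(voxel)

    # Point-to-point ICP starting from the identity, assuming the robot is
    # already roughly aligned with the crop row (as in the strip-cropping setup).
    result = o3d.pipelines.registration.registration_icp(
        lpc_ds, gpc_ds, max_dist, np.eye(4),
        o3d.pipelines.registration.TransformationEstimationPointToPoint())
    return result.transformation, result.fitness

# The translation part of the returned 4x4 transform gives the robot's offset
# along the row, comparable in spirit to the 11-13 mm errors reported.
```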
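The abstract also names a CCD (Cyclic Coordinate Descent) inverse-kinematics solver for the manipulator. As a minimal sketch of the CCD idea only, for a hypothetical planar serial chain and a hypothetical helper ccd_ik (not the authors' solver, which is integrated with the ROS motion-planning stack), each pass sweeps the joints from tip to base and rotates every joint so the end effector swings towards the target.

```python
# Minimal CCD inverse-kinematics sketch for a planar serial arm (illustrative only).
import math

def ccd_ik(lengths, angles, target, iterations=50, tol=1e-4):
    """Adjust joint angles of a planar chain so the end effector reaches
    target = (x, y); lengths and angles are lists of equal size."""
    def forward(angles):
        # Positions of the base, every joint, and the end effector.
        pts = [(0.0, 0.0)]
        theta = 0.0
        for l, a in zip(lengths, angles):
            theta += a
            x, y = pts[-1]
            pts.append((x + l * math.cos(theta), y + l * math.sin(theta)))
        return pts

    for _ in range(iterations):
        if math.dist(forward(angles)[-1], target) < tol:
            break
        # Sweep joints from the tip to the base, rotating each one so the
        # end effector swings towards the target.
        for i in reversed(range(len(angles))):
            pts = forward(angles)
            jx, jy = pts[i]
            ex, ey = pts[-1]
            angles[i] += (math.atan2(target[1] - jy, target[0] - jx)
                          - math.atan2(ey - jy, ex - jx))
    return angles

# Example usage with made-up link lengths (metres) and a reachable target:
# ccd_ik([0.3, 0.25, 0.15], [0.0, 0.0, 0.0], (0.4, 0.2))
```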
Pages: 15