Using a depth camera for crop row detection and mapping for under-canopy navigation of agricultural robotic vehicle

Cited by: 72
Authors
Gai, Jingyao [1 ,2 ]
Xiang, Lirong [1 ]
Tang, Lie [1 ]
Affiliations
[1] Iowa State Univ, Dept Agr & Biosyst Engn, Ames, IA 50011 USA
[2] Guangxi Univ, Coll Mechatron Engn, Nanning, Peoples R China
Funding
US National Science Foundation;
Keywords
Agricultural robot navigation; Depth imaging; Inter-row positioning; Field mapping; Under-canopy imaging;
DOI
10.1016/j.compag.2021.106301
CLC Number
S [Agricultural Sciences];
Subject Classification
09;
Abstract
Computer vision provides local environmental information for robotic navigation in crop fields. It is particularly useful for robots operating under the canopies of tall plants such as corn (Zea mays) and sorghum (Sorghum bicolor), where GPS signals are not always receivable. However, the development of under-canopy navigation systems remains an open research area. The key contribution of our work is the development of a vision-based system for under-canopy navigation using a Time-of-Flight (ToF) camera. In the system, a novel algorithm detects parallel crop rows in depth images taken under crop canopies. Two critical navigation tasks were accomplished based on the detection results: 1) generating crop field maps as occupancy grids when reliable robot localization is available (from other sources such as GPS and IMU), and 2) providing inter-row vehicle positioning data when a field map is available but localization is not reliable. The proposed system was evaluated in field tests. The results showed that the system mapped crop rows with mean absolute errors (MAEs) of 3.4 cm and 3.6 cm in corn and sorghum fields, respectively, and provided lateral positioning data with MAEs of 5.0 cm and 4.2 cm in corn and sorghum crop rows, respectively. The potential and limitations of using ToF cameras for under-canopy navigation are discussed.
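Being an abstract, the text above gives no implementation detail. As a rough, hypothetical illustration of the inter-row positioning task it describes (not the paper's actual algorithm; the function name, coordinate convention, and default parameters below are our assumptions), the Python sketch estimates the vehicle's lateral offset by histogramming the lateral coordinates of an under-canopy ToF point cloud and taking the two dominant peaks as the flanking stalk rows.

```python
import numpy as np

def lateral_offset_from_depth(points, row_spacing=0.76, bin_width=0.02):
    """Estimate the camera's lateral offset from the row-center line.

    A simplified sketch, not the paper's method. Assumes `points` is an
    (N, 3) array in the camera frame (x lateral, y forward, z up, meters),
    that both flanking rows are visible, and that the nominal row spacing
    is known (0.76 m is a common corn row spacing).
    """
    # Keep near-ground returns, where stalks dominate over leaves.
    stalks = points[points[:, 2] < 0.5]

    # Histogram lateral positions; stalk rows appear as peaks.
    bins = np.arange(-row_spacing, row_spacing + bin_width, bin_width)
    hist, edges = np.histogram(stalks[:, 0], bins=bins)
    centers = 0.5 * (edges[:-1] + edges[1:])

    # Strongest peak on each side of the camera = left and right row.
    left = centers[centers < 0][np.argmax(hist[centers < 0])]
    right = centers[centers >= 0][np.argmax(hist[centers >= 0])]

    # Offset of the camera from the midline between the two rows.
    return -0.5 * (left + right)
```

In a full system along the lines the abstract describes, the same detected row positions could also be rasterized into an occupancy-grid map whenever reliable localization is available, and matched against that map for positioning when it is not.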
Pages: 14