Automated crop plant detection based on the fusion of color and depth images for robotic weed control

Cited by: 59
Authors
Gai, Jingyao [1 ]
Tang, Lie [1 ]
Steward, Brian L. [1 ]
Affiliations
[1] Iowa State Univ, Agr & Biosyst Engn, Ames, IA 50011 USA
Funding
U.S. National Institute of Food and Agriculture (NIFA)
Keywords
computer vision; crop detection; robotic weeding; sensor fusion; CLASSIFICATION; IDENTIFICATION; SEGMENTATION; ALGORITHM; SPACE;
DOI
10.1002/rob.21897
Chinese Library Classification
TP24 [Robotics]
Discipline codes
080202; 1405
Abstract
Robotic weeding enables automatic, precise, and effective weed control near or within crop rows. A computer-vision system was developed for detecting crop plants at different growth stages for robotic weed control. The fusion of color images and depth images was investigated as a means of enhancing crop plant detection accuracy under high weed populations. In-field images of broccoli and lettuce were acquired 3-27 days after transplanting with a Kinect v2 sensor. The image processing pipeline included data preprocessing, vegetation pixel segmentation, plant extraction, feature extraction, feature-based localization refinement, and crop plant classification. For broccoli and lettuce, the color-depth fusion algorithm produced high true-positive detection rates (91.7% and 90.8%, respectively) and low average false-discovery rates (1.1% and 4.0%, respectively). Mean absolute localization errors of the crop plant stems were 26.8 mm for broccoli and 7.4 mm for lettuce. The fusion of color and depth proved beneficial for segmenting crop plants from the background, improving average segmentation success rates from 87.2% (depth-based) and 76.4% (color-based) to 96.6% for broccoli, and from 74.2% (depth-based) and 81.2% (color-based) to 92.4% for lettuce. The fusion-based algorithm had reduced performance in detecting crop plants at early growth stages.
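The color-depth fusion segmentation summarized in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the Excess Green color cue, the height-above-ground depth cue, the threshold values, and the simple union rule for fusing the two masks are all illustrative assumptions; the sketch only shows how two independent vegetation cues can be combined per pixel.

```python
import numpy as np

def vegetation_mask(rgb, depth, ground_depth,
                    exg_thresh=0.1, height_thresh=0.02):
    """Fuse a color-based and a depth-based vegetation mask (illustrative).

    rgb          : (H, W, 3) float array with channel values in [0, 1]
    depth        : (H, W) distance from the sensor, in meters
    ground_depth : scalar distance from the sensor to the ground plane, in meters
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    total = r + g + b + 1e-9                 # avoid division by zero
    # Color cue: Excess Green index (2g - r - b) on normalized channels;
    # greenish pixels score high.
    exg = (2.0 * g - r - b) / total
    color_mask = exg > exg_thresh
    # Depth cue: pixels closer to the sensor than the ground plane,
    # i.e. with positive height above ground.
    height = ground_depth - depth            # height above ground, meters
    depth_mask = height > height_thresh
    # Union: keep a pixel if either cue flags it as vegetation.
    return color_mask | depth_mask
```

A union is the simplest fusion rule; a real pipeline would likely follow it with morphological cleanup and the plant-extraction and classification stages listed in the abstract.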
Pages: 35-52
Page count: 18
References
51 records in total
  • [1] An Approach to the Use of Depth Cameras for Weed Volume Estimation
    Andujar, Dionisio
    Dorado, Jose
    Fernandez-Quintanilla, Cesar
    Ribeiro, Angela
    SENSORS, 2016, 16(7)
  • [2] An Ultrasonic System for Weed Detection in Cereal Crops
    Andujar, Dionisio
    Weis, Martin
    Gerhards, Roland
    SENSORS, 2012, 12(12): 17343-17357
  • [3] [Anonymous], 2018, DEEP LEARNING CRITIC
  • [4] Robot for weed species plant-specific management
    Bawden, Owen
    Kulk, Jason
    Russell, Ray
    McCool, Chris
    English, Andrew
    Dayoub, Feras
    Lehnert, Chris
    Perez, Tristan
    JOURNAL OF FIELD ROBOTICS, 2017, 34(6): 1179-1199
  • [5] A Modified Fast Parallel Algorithm for Thinning Digital Patterns
    Chen, Y. S.
    Hsu, W. H.
    PATTERN RECOGNITION LETTERS, 1988, 7(2): 99-106
  • [6] A metrological characterization of the Kinect V2 time-of-flight camera
    Corti, Andrea
    Giancola, Silvio
    Mainetti, Giacomo
    Sala, Remo
    ROBOTICS AND AUTONOMOUS SYSTEMS, 2016, 75: 584-594
  • [7] Exploiting Offload Enabled Network Interfaces
    Di Girolamo, Salvatore
    Jolivet, Pierre
    Underwood, Keith D.
    Hoefler, Torsten
    PROCEEDINGS 2015 IEEE 23RD ANNUAL SYMPOSIUM ON HIGH-PERFORMANCE INTERCONNECTS (HOTI 2015), 2015: 26-33
  • [8] Dyrmann M., 2017, Advances in Animal Biosciences, V8, P842, DOI 10.1017/S2040470017000206
  • [9] Estimation of plant species by classifying plants and leaves in combination
    Dyrmann, Mads
    Christiansen, Peter
    Midtiby, Henrik Skov
    JOURNAL OF FIELD ROBOTICS, 2018, 35(2): 202-212
  • [10] Plant species classification using deep convolutional neural network
    Dyrmann, Mads
    Karstoft, Henrik
    Midtiby, Henrik Skov
    BIOSYSTEMS ENGINEERING, 2016, 151: 72-80