Crop Agnostic Monitoring Driven by Deep Learning

Cited by: 27
Authors
Halstead, Michael [1 ]
Ahmadi, Alireza [1 ]
Smitt, Claus [1 ]
Schmittmann, Oliver [1 ]
McCool, Chris [1 ]
Affiliations
[1] Univ Bonn, Inst Agr Engn, Agr Robot, Bonn, Germany
Keywords
plant classification; artificial intelligence; deep learning; convolutional neural network; image segmentation; field plant observation; ROW WEED-CONTROL;
DOI
10.3389/fpls.2021.786702
CLC Classification
Q94 [Botany];
Discipline Code
071001;
Abstract
Farmers require diverse and complex information to make agronomic decisions about crop management, including intervention tasks. Generally, this information is gathered by farmers traversing their fields or glasshouses, which is often a time-consuming and potentially expensive process. In recent years, robotic platforms have gained significant traction due to advances in artificial intelligence. However, these platforms are usually tied to one setting (such as arable farmland), or their algorithms are designed for a single platform. This creates a significant gap between the available technology and farmer requirements. We propose a novel field-agnostic monitoring technique that is able to operate on two different robots, in arable farmland or in a glasshouse (horticultural setting). Instance segmentation forms the backbone of this approach, from which object location and class, object area, and yield information can be obtained. In arable farmland, our segmentation network is able to discriminate crops and weeds at the species level, and in a glasshouse it is able to detect sweet pepper and estimate their ripeness. For yield information, we introduce a novel matching criterion that removes the pixel-wise constraints of previous versions. This approach is able to accurately estimate the number of fruit (sweet pepper) in a glasshouse, with a normalized absolute error of 4.7% and an R² of 0.901 against the visual ground truth. When applied to cluttered arable farmland scenes, it improves on the prior approach by 50%. Finally, a qualitative analysis shows the validity of this agnostic monitoring algorithm by supplying decision-enabling information to the farmer, such as the impact of a low-level weeding intervention scheme.
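The counting accuracy above is reported as a normalized absolute error (4.7%) and an R² of 0.901 against the visual ground truth. As a minimal sketch of how these two standard metrics are commonly computed from per-scene fruit counts (the function names and example numbers below are illustrative, not taken from the paper):

```python
# Illustrative sketch: normalized absolute error and coefficient of
# determination (R^2) between predicted and ground-truth counts.

def normalized_absolute_error(predicted, truth):
    """Total absolute counting error, normalized by the true total count."""
    return sum(abs(p - t) for p, t in zip(predicted, truth)) / sum(truth)

def r_squared(predicted, truth):
    """Coefficient of determination between predicted and true counts."""
    mean_t = sum(truth) / len(truth)
    ss_res = sum((t - p) ** 2 for p, t in zip(predicted, truth))  # residual sum of squares
    ss_tot = sum((t - mean_t) ** 2 for t in truth)                # total sum of squares
    return 1.0 - ss_res / ss_tot

# Hypothetical per-row sweet pepper counts (numbers are made up for the demo).
truth = [12, 8, 15, 10]
pred = [11, 9, 14, 10]
print(f"NAE: {normalized_absolute_error(pred, truth):.3f}")
print(f"R^2: {r_squared(pred, truth):.3f}")
```

An NAE near zero and an R² near one both indicate that the predicted counts closely track the visual ground truth.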
Pages: 16
Related Papers
44 items total
[11]   Searching for people using semantic soft biometric descriptions [J].
Denman, Simon ;
Halstead, Michael ;
Fookes, Clinton ;
Sridharan, Sridha .
PATTERN RECOGNITION LETTERS, 2015, 68 :306-315
[12]   Thorvald II - a Modular and Re-configurable Agricultural Robot [J].
Grimstad, Lars ;
From, Pal Johan .
IFAC PAPERSONLINE, 2017, 50 (01) :4588-4593
[13]  
Halstead M, 2020, P DIGITAL IMAGE COMP
[14]   Fruit Quantity and Ripeness Estimation Using a Robotic Vision System [J].
Halstead, Michael ;
McCool, Christopher ;
Denman, Simon ;
Perez, Tristan ;
Fookes, Clinton .
IEEE ROBOTICS AND AUTOMATION LETTERS, 2018, 3 (04) :2995-3002
[15]   Remote Control of Greenhouse Vegetable Production with Artificial Intelligence - Greenhouse Climate, Irrigation, and Crop Production [J].
Hemming, Silke ;
de Zwart, Feije ;
Elings, Anne ;
Righini, Isabella ;
Petropoulou, Anna .
SENSORS, 2019, 19 (08)
[16]  
Hung C, 2013, IEEE INT C INT ROBOT, P5314, DOI 10.1109/IROS.2013.6697125
[17]   Slow and steady feature analysis: higher order temporal coherence in video [J].
Jayaraman, Dinesh ;
Grauman, Kristen .
2016 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2016, :3852-3861
[18]   L*a*b*Fruits: A Rapid and Robust Outdoor Fruit Detection System Combining Bio-Inspired Features with One-Stage Deep Learning Networks [J].
Kirk, Raymond ;
Cielniak, Grzegorz ;
Mangan, Michael .
SENSORS, 2020, 20 (01)
[19]   Deep learning for real-time fruit detection and orchard fruit load estimation: benchmarking of 'MangoYOLO' [J].
Koirala, A. ;
Walsh, K. B. ;
Wang, Z. ;
McCarthy, C. .
PRECISION AGRICULTURE, 2019, 20 (06) :1107-1135
[20]   Autonomous Sweet Pepper Harvesting for Protected Cropping Systems [J].
Lehnert, Christopher ;
English, Andrew ;
McCool, Christopher ;
Tow, Adam W. ;
Perez, Tristan .
IEEE ROBOTICS AND AUTOMATION LETTERS, 2017, 2 (02) :872-879