Automatic Plant Counting and Location Based on a Few-Shot Learning Technique

Cited by: 44
Authors
Karami, Azam [1 ]
Crawford, Melba [2 ]
Delp, Edward [3 ]
Affiliations
[1] Purdue Univ, Dept Agron, W Lafayette, IN 47907 USA
[2] Purdue Univ, Civil Engn & Agron Dept, W Lafayette, IN 47907 USA
[3] Purdue Univ, Sch Elect & Comp Engn, W Lafayette, IN 47907 USA
Keywords
Detectors; Feature extraction; Training; Object detection; Remote sensing; Plants (biology); Agriculture; CenterNet; few-shot learning (FSL); image-based plant phenotyping; localization and counting; transfer learning (TL); LOW-ALTITUDE;
DOI
10.1109/JSTARS.2020.3025790
Chinese Library Classification
TM [Electrical Engineering]; TN [Electronics and Communication Technology];
Subject Classification Codes
0808; 0809;
Abstract
Plant counting and location are essential for both plant breeding experiments and production agriculture. Stand count indicates the overall emergence of plants relative to the number of seeds planted, while location provides information on the associated variability within a plot or geographic area of a field. Deep learning has been successfully applied in various application domains, including plant phenotyping. This article proposes the use of deep learning techniques, more specifically anchor-free detectors, to identify and count maize plants in RGB images acquired from unmanned aerial vehicles. The results were obtained using a modified CenterNet architecture, with validation performed against manual human annotation. Experimental results demonstrated an overall precision >95% for examples where training and testing were performed on the same field. Few-shot learning was also explored, where the trained network was 1) directly applied to fields in other geographic areas and 2) updated using small quantities of training data from the other locations.
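In a CenterNet-style anchor-free detector, counting and localization come from decoding a predicted center heatmap into discrete peaks. The sketch below illustrates that general idea only and is not the paper's implementation: the function name, the 3x3 max-pool non-maximum suppression, and the 0.3 confidence threshold are assumptions for illustration (PyTorch).

```python
# Hypothetical sketch of CenterNet-style heatmap decoding for plant
# counting and location. Model, heatmap shape, and the 0.3 confidence
# threshold are assumed values, not the paper's settings.
import torch
import torch.nn.functional as F

def count_and_locate(heatmap: torch.Tensor, threshold: float = 0.3):
    """Extract plant centers from a [1, 1, H, W] center heatmap in [0, 1].

    Returns (count, locations), where locations is an (N, 2) tensor of
    (row, col) coordinates in heatmap space.
    """
    # Keep only local maxima: a 3x3 max-pool acts as a simple NMS.
    pooled = F.max_pool2d(heatmap, kernel_size=3, stride=1, padding=1)
    peaks = (heatmap == pooled) & (heatmap > threshold)

    # Indices of surviving peaks; drop the batch and channel columns.
    coords = peaks.nonzero(as_tuple=False)[:, 2:]
    return coords.shape[0], coords

# Example usage with a synthetic heatmap standing in for network output.
if __name__ == "__main__":
    hm = torch.zeros(1, 1, 64, 64)
    hm[0, 0, 10, 12] = 0.9   # two synthetic plant centers
    hm[0, 0, 40, 33] = 0.7
    count, locs = count_and_locate(hm)
    print(count, locs.tolist())
```

The count is simply the number of surviving peaks, and the peak coordinates (scaled back to image resolution) give the per-plant locations that the abstract refers to.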
Pages: 5872-5886
Page count: 15