Estimation of Off-Target Dicamba Damage on Soybean Using UAV Imagery and Deep Learning

Cited by: 5
Authors
Tian, Fengkai [1]
Vieira, Caio Canella [2]
Zhou, Jing [3]
Zhou, Jianfeng [4]
Chen, Pengyin [4]
Affiliations
[1] Univ Missouri, Dept Biomed Biol & Chem Engn, Columbia, MO 65211 USA
[2] Univ Arkansas, Bumpers Coll, Crop Soil & Environm Sci, Fayetteville, AR 72701 USA
[3] Univ Wisconsin Madison, Biol Syst Engn, Madison, WI 53706 USA
[4] Univ Missouri, Div Plant Sci & Technol, Columbia, MO 65211 USA
Keywords
soybean; dicamba tolerance; high-throughput phenotyping; deep learning; recognition; CNN
DOI: 10.3390/s23063241
Chinese Library Classification
O65 [Analytical Chemistry]
Subject Classification
070302; 081704
Abstract
Weeds can cause significant yield losses and will continue to be a problem for agricultural production under climate change. Dicamba is widely used to control weeds in monocot crops and in genetically engineered dicamba-tolerant (DT) dicot crops, such as soybean and cotton, which has resulted in severe off-target dicamba exposure and substantial yield losses in non-tolerant crops. There is strong demand for non-genetically engineered DT soybeans developed through conventional breeding and selection, and public breeding programs have identified genetic resources that confer greater tolerance to off-target dicamba damage in soybeans. Efficient, high-throughput phenotyping tools can facilitate the accurate collection of large numbers of crop traits and improve breeding efficiency. This study aimed to evaluate unmanned aerial vehicle (UAV) imagery and deep-learning-based data analytic methods for quantifying off-target dicamba damage in genetically diverse soybean genotypes. A total of 463 soybean genotypes were planted in five fields (with different soil types) with prolonged exposure to off-target dicamba in 2020 and 2021. Crop damage due to off-target dicamba was assessed by breeders using a 1-5 scale in 0.5 increments, and the ratings were further grouped into three classes: susceptible (>= 3.5), moderate (2.0 to 3.0), and tolerant (<= 1.5). A UAV platform equipped with a red-green-blue (RGB) camera was used to collect images on the same days as the visual assessments. Collected images were stitched into an orthomosaic image for each field, and soybean plots were manually segmented from the orthomosaics. Deep learning models, including the dense convolutional network-121 (DenseNet121), residual network-50 (ResNet50), Visual Geometry Group-16 (VGG16), and Xception (based on depthwise separable convolutions), were developed to quantify crop damage levels. Results show that DenseNet121 had the best classification performance, with an accuracy of 82%; the 95% binomial proportion confidence interval ranged from 79% to 84% (p-value <= 0.01). In addition, no extreme misclassifications (i.e., confusions between tolerant and susceptible genotypes) were observed. These results are promising because soybean breeding programs typically aim to identify genotypes with 'extreme' phenotypes (e.g., the top 10% most tolerant genotypes). This study demonstrates that UAV imagery and deep learning have great potential for high-throughput quantification of soybean damage due to off-target dicamba and can improve the efficiency of breeding programs in selecting soybean genotypes with desired traits.
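
The three-class grouping of the 1-5 visual ratings and the reported accuracy interval can be illustrated with a short sketch. Only the thresholds (tolerant <= 1.5, moderate 2.0-3.0, susceptible >= 3.5), the 82% accuracy, and the use of a 95% binomial proportion confidence interval come from the abstract; the function names, the use of Python, and the example test-set size are illustrative assumptions, not the authors' code.

```python
# Illustrative sketch (not the paper's implementation): bin breeder ratings
# into the three damage classes and compute a normal-approximation 95%
# binomial proportion confidence interval around a given accuracy.
import math

def damage_class(rating: float) -> str:
    """Map a 1-5 visual rating (0.5 increments) to a damage class."""
    if rating <= 1.5:
        return "tolerant"
    if rating <= 3.0:
        return "moderate"
    return "susceptible"  # ratings >= 3.5

def binomial_ci(accuracy: float, n_samples: int, z: float = 1.96):
    """95% confidence interval for a classification accuracy on n samples."""
    half_width = z * math.sqrt(accuracy * (1.0 - accuracy) / n_samples)
    return accuracy - half_width, accuracy + half_width

print(damage_class(2.5))        # -> "moderate"
print(binomial_ci(0.82, 1000))  # n_samples = 1000 is an assumed test-set size
```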
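The abstract names DenseNet121 as the best-performing model; a minimal transfer-learning classifier of this kind could be set up in TensorFlow/Keras as sketched below. The ImageNet pretraining, 224x224 input size, pooling/dropout head, and training settings are assumptions for illustration and are not taken from the paper.

```python
# Minimal sketch (assumed configuration, not the authors' model): a DenseNet121
# backbone pretrained on ImageNet with a small head for the three damage
# classes (tolerant, moderate, susceptible).
import tensorflow as tf

NUM_CLASSES = 3
IMG_SIZE = (224, 224)  # assumed plot-image input size

# Plot images are expected to be preprocessed with
# tf.keras.applications.densenet.preprocess_input before being fed to the model.
backbone = tf.keras.applications.DenseNet121(
    include_top=False, weights="imagenet", input_shape=IMG_SIZE + (3,)
)
backbone.trainable = False  # freeze pretrained weights for the first training stage

model = tf.keras.Sequential([
    backbone,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.3),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(
    optimizer="adam",
    loss="sparse_categorical_crossentropy",  # integer labels: 0 tolerant, 1 moderate, 2 susceptible
    metrics=["accuracy"],
)
# model.fit(train_ds, validation_data=val_ds, epochs=20)  # datasets of segmented plot images
```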
Pages: 13