Deep Learning-Based Weed Detection Using UAV Images: A Comparative Study

Cited by: 23
Authors
Shahi, Tej Bahadur [1 ,2 ]
Dahal, Sweekar [3 ]
Sitaula, Chiranjibi [4 ]
Neupane, Arjun [1 ]
Guo, William [1 ]
Affiliations
[1] Cent Queensland Univ, Sch Engn & Technol, North Rockhampton, Qld 4701, Australia
[2] Tribhuvan Univ, Cent Dept Comp Science & IT, Kathmandu 44600, Nepal
[3] Tribhuvan Univ, Inst Engn, Kathmandu 44600, Nepal
[4] Univ Melbourne, Dept Infrastruct Engn, Earth Observat & AI Res Grp, Parkville, Vic 3010, Australia
Keywords
semantic segmentation; UAV; drones; deep learning; weed detection; precision agriculture; FOOD
DOI
10.3390/drones7100624
CLC classification number
TP7 [Remote sensing technology]
Subject classification codes
081102; 0816; 081602; 083002; 1404
Abstract
Semantic segmentation has been widely used in precision agriculture tasks such as weed detection, which is pivotal to increasing crop yields. Various well-established and swiftly evolving AI models have been developed of late for semantic segmentation in weed detection; nevertheless, there is insufficient information about their comparative performance for optimal model selection in this field. Identifying such a model helps the agricultural community make the best use of the technology. As such, we perform a comparative study of cutting-edge AI deep learning-based segmentation models for weed detection on an RGB image dataset acquired with a UAV, called CoFly-WeedDB. For this, we leverage AI segmentation models, ranging from SegNet to DeepLabV3+, combined with five backbone convolutional neural networks (VGG16, ResNet50, DenseNet121, EfficientNetB0 and MobileNetV2). The results show that UNet with EfficientNetB0 as the backbone CNN is the best-performing model among the candidates evaluated in this study on the CoFly-WeedDB dataset, achieving a Precision of 88.20%, Recall of 88.97%, F1-score of 88.24% and mean Intersection over Union of 56.21%. From this study, we suggest that the UNet model combined with EfficientNetB0 could potentially be used by the concerned stakeholders (e.g., farmers, the agricultural industry) to detect weeds more accurately in the field, thereby removing them at the earliest stage and increasing crop yields.
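The abstract reports Precision, Recall, F1-score and mean Intersection over Union (mIoU) for the segmentation models. As a hedged illustration only (not the authors' code), the sketch below shows how these metrics can be computed from flattened class-label masks, assuming a simple macro average over classes; the paper itself may use a different averaging scheme.

```python
# Minimal sketch: segmentation metrics (Precision, Recall, F1, mIoU)
# from flattened ground-truth and predicted class masks.
# Assumption (not from the paper): macro-averaging over classes.

def confusion_counts(y_true, y_pred, num_classes):
    """Per-class true-positive, false-positive and false-negative counts."""
    tp = [0] * num_classes
    fp = [0] * num_classes
    fn = [0] * num_classes
    for t, p in zip(y_true, y_pred):
        if t == p:
            tp[t] += 1
        else:
            fp[p] += 1  # predicted class p where it wasn't
            fn[t] += 1  # missed true class t
    return tp, fp, fn

def segmentation_metrics(y_true, y_pred, num_classes=2):
    """Macro-averaged Precision, Recall, F1 and mean IoU."""
    tp, fp, fn = confusion_counts(y_true, y_pred, num_classes)
    precision, recall, f1, iou = [], [], [], []
    for c in range(num_classes):
        prec = tp[c] / (tp[c] + fp[c]) if tp[c] + fp[c] else 0.0
        rec = tp[c] / (tp[c] + fn[c]) if tp[c] + fn[c] else 0.0
        precision.append(prec)
        recall.append(rec)
        f1.append(2 * prec * rec / (prec + rec) if prec + rec else 0.0)
        denom = tp[c] + fp[c] + fn[c]  # union of prediction and truth
        iou.append(tp[c] / denom if denom else 0.0)
    n = num_classes
    return {
        "precision": sum(precision) / n,
        "recall": sum(recall) / n,
        "f1": sum(f1) / n,
        "miou": sum(iou) / n,
    }

# Toy example: flattened 4x2 binary masks (0 = crop/background, 1 = weed).
truth = [0, 0, 1, 1, 0, 1, 0, 0]
pred  = [0, 1, 1, 1, 0, 0, 0, 0]
m = segmentation_metrics(truth, pred)
```

Note that mIoU is typically much lower than pixel-wise F1 on imbalanced weed datasets (as in the abstract: 88.24% F1 vs. 56.21% mIoU), because IoU penalizes both false positives and false negatives in a single ratio per class.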
Pages: 18
Related papers (61 in total)
[51] Szegedy, Christian; Vanhoucke, Vincent; Ioffe, Sergey; Shlens, Jon; Wojna, Zbigniew. Rethinking the Inception Architecture for Computer Vision. 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2016: 2818-2826.
[52] Tan MX. Proceedings of Machine Learning Research, 2021, 139: 7102.
[53] Tan MX. Proceedings of Machine Learning Research, 2019, 97.
[54] Tellaeche, Alberto; Pajares, Gonzalo; Burgos-Artizzu, Xavier P.; Ribeiro, Angela. A computer vision approach for weeds identification through Support Vector Machines. Applied Soft Computing, 2011, 11(1): 908-915.
[55] Lin, Tsung-Yi; Goyal, Priya; Girshick, Ross; He, Kaiming; Dollar, Piotr. Focal Loss for Dense Object Detection. 2017 IEEE International Conference on Computer Vision (ICCV), 2017: 2999-3007.
[56] van Dijk, Michiel; Morley, Tom; Rau, Marie Luise; Saghai, Yashar. A meta-analysis of projected global food demand and population at risk of hunger for the period 2010-2050. Nature Food, 2021, 2(7): 494+.
[57] Veeranampalayam Sivakumar, Arun Narenthiran; Li, Jiating; Scott, Stephen; Psota, Eric; Jhala, Amit J.; Luck, Joe D.; Shi, Yeyin. Comparison of Object Detection and Patch-Based Classification Deep Learning Models on Mid- to Late-Season Weed Detection in UAV Imagery. Remote Sensing, 2020, 12(13).
[58] Waldner, Francois; Diakogiannis, Foivos I. Deep learning on edge: Extracting field boundaries from satellite images with a convolutional neural network. Remote Sensing of Environment, 2020, 245.
[59] Wu, Zhangnan; Chen, Yajun; Zhao, Bo; Kang, Xiaobing; Ding, Yuanyuan. Review of Weed Detection Methods Based on Computer Vision. Sensors, 2021, 21(11).
[60] Xu, Beibei; Fan, Jiahao; Chao, Jun; Arsenijevic, Nikola; Werle, Rodrigo; Zhang, Zhou. Instance segmentation method for weed detection using UAV imagery in soybean fields. Computers and Electronics in Agriculture, 2023, 211.