YOLO-CWD: A novel model for crop and weed detection based on improved YOLOv8

Times Cited: 0
Authors
Ma, Chaoran [1 ,2 ]
Chi, Ge [1 ,2 ]
Ju, Xueping [1 ]
Zhang, Junqiang [1 ,3 ]
Yan, Changxiang [1 ,4 ]
Affiliations
[1] Chinese Acad Sci, Changchun Inst Opt Fine Mech & Phys, Changchun 130033, Peoples R China
[2] Univ Chinese Acad Sci, Beijing 100049, Peoples R China
[3] YUSENSE Informat Technol & Equipment, Qingdao 266000, Peoples R China
[4] Univ Chinese Acad Sci, Ctr Mat Sci & Optoelect Engn, Beijing 100049, Peoples R China
Keywords
Crop detection; Deep learning; Precision agriculture; Weed detection; YOLO;
DOI
10.1016/j.cropro.2025.107169
CLC Classification
S3 [Agronomy];
Discipline Code
0901;
Abstract
Accurate and efficient crop and weed detection is pivotal for advancing precision agriculture, and lightweight, high-performance models are crucial for real-time applications. This study introduces YOLO-CWD, an improved version of the You Only Look Once version 8 nano (YOLOv8n) model, designed to achieve high detection accuracy while maintaining a compact size in real-time crop and weed detection tasks. A novel hybrid attention mechanism is proposed that enhances the model's capacity to differentiate between crops and weeds and outperforms other commonly used attention mechanisms. To address the limitations of the Complete Intersection over Union (CIoU) loss function in accurately locating predicted bounding boxes, a novel loss function, Point Intersection over Union (PIoU), is proposed, which accelerates convergence during training. The resulting YOLO-CWD model has 3.49 M parameters and 9.6 GFLOPs, balancing compactness and performance. Ablation experiments demonstrate significant improvements in detecting maize and weeds, with mAP@50 reaching 0.751 and mAP@50:95 reaching 0.506, gains of 0.008 and 0.012, respectively, over the baseline. The model outperforms other state-of-the-art models, and its robustness is validated across multiple datasets. Further evaluation under varying lighting and soil moisture conditions highlights the model's strong generalization capability. These findings confirm YOLO-CWD's suitability for crop and weed detection tasks, addressing key challenges in precision agriculture and paving the way for sustainable practices.
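The abstract contrasts the proposed PIoU loss with the standard CIoU loss. The PIoU formulation is not given in this record, but the CIoU baseline the authors modify is publicly defined (Zheng et al., 2020), and a minimal reference sketch of it may help frame the claim; this is not the authors' code, only the standard definition:

```python
import math

def ciou_loss(box1, box2):
    """Standard CIoU loss between two boxes given as (x1, y1, x2, y2).

    CIoU = IoU - rho^2/c^2 - alpha*v; the loss is 1 - CIoU. This is the
    baseline loss YOLO-CWD replaces with PIoU, whose details are not
    specified in this abstract.
    """
    # Intersection area
    ix1, iy1 = max(box1[0], box2[0]), max(box1[1], box2[1])
    ix2, iy2 = min(box1[2], box2[2]), min(box1[3], box2[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)

    w1, h1 = box1[2] - box1[0], box1[3] - box1[1]
    w2, h2 = box2[2] - box2[0], box2[3] - box2[1]
    union = w1 * h1 + w2 * h2 - inter
    iou = inter / union

    # Squared distance between box centers (rho^2)
    rho2 = ((box1[0] + box1[2] - box2[0] - box2[2]) ** 2 +
            (box1[1] + box1[3] - box2[1] - box2[3]) ** 2) / 4.0

    # Squared diagonal of the smallest enclosing box (c^2)
    cw = max(box1[2], box2[2]) - min(box1[0], box2[0])
    ch = max(box1[3], box2[3]) - min(box1[1], box2[1])
    c2 = cw ** 2 + ch ** 2

    # Aspect-ratio consistency term v and its trade-off weight alpha
    v = (4 / math.pi ** 2) * (math.atan(w2 / h2) - math.atan(w1 / h1)) ** 2
    alpha = v / (1 - iou + v + 1e-9)

    return 1 - iou + rho2 / c2 + alpha * v
```

For perfectly aligned boxes the loss is 0; it grows with center distance and aspect-ratio mismatch, which is the localization behavior PIoU is said to improve.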
Pages: 11