Real-time recognition research for an automated egg-picking robot in free-range duck sheds

Times Cited: 0
Authors
Jie, Dengfei [1 ]
Wang, Jun [1 ]
Wang, Hao [1 ]
Lv, Huifang [1 ]
He, Jincheng [1 ]
Wei, Xuan [1 ]
Affiliations
[1] Fujian Agr & Forestry Univ, Coll Mech & Elect Engn, 15 Shangxiadian Rd, Fuzhou 350002, Peoples R China
Keywords
Duck egg-picking robot; Duck egg detection; YOLOv5s; Lightweight model; Attention mechanism;
DOI
10.1007/s11554-025-01640-y
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Achieving efficient and accurate detection and localization of duck eggs in the unstructured environment of free-range duck sheds is crucial for developing automated egg-picking robots. This paper proposes an improved YOLOv5s-based model (YOLOv5s-MNKS) designed to enhance detection performance, reduce model complexity, and improve the robot's adaptability and operational efficiency in complex environments. The model utilizes MobileNetV3 as the backbone network, reducing the number of parameters and increasing detection speed. The Squeeze-and-Excitation Module is replaced with a Normalization-based Attention Module to improve feature extraction capability. Group Shuffle Convolution and Bidirectional Feature Pyramid Network are introduced in the Neck layer, enhancing multi-scale feature fusion while reducing parameter count. A Soft-CIoU-NMS loss function is also designed, which improves detection accuracy in scenarios involving dense stacking and occlusion by lowering the confidence of overlapping bounding boxes instead of directly eliminating them. Experimental results demonstrate that the mAP of YOLOv5s-MNKS reaches 95.6%, representing a 0.3% improvement over the original model, while the model size is reduced to 5.7 MB, approximately 40% of the original size. When deployed on the Jetson Nano embedded platform with TensorRT acceleration, the model achieves a detection frame rate of 22.3 frames per second. In simulated and real-world duck shed scenarios, the improved model accurately and quickly identifies and locates duck eggs in complex environments, including occlusion, stacking, and low lighting, demonstrating strong robustness and applicability. This research provides technical support for the future development of duck egg-picking robots.
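The abstract's key mechanism — lowering the confidence of overlapping bounding boxes instead of discarding them — is the Soft-NMS idea (reference [1]) combined with a CIoU overlap measure. The paper's exact Soft-CIoU-NMS formulation is not given in this record, so the following is only a minimal NumPy sketch of that general scheme: the function names, the Gaussian decay, and the `sigma`/`score_thresh` parameters are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def ciou(box, boxes):
    """Complete IoU between one box and an array of boxes, format (x1, y1, x2, y2)."""
    # Plain IoU
    x1 = np.maximum(box[0], boxes[:, 0]); y1 = np.maximum(box[1], boxes[:, 1])
    x2 = np.minimum(box[2], boxes[:, 2]); y2 = np.minimum(box[3], boxes[:, 3])
    inter = np.clip(x2 - x1, 0, None) * np.clip(y2 - y1, 0, None)
    area_a = (box[2] - box[0]) * (box[3] - box[1])
    area_b = (boxes[:, 2] - boxes[:, 0]) * (boxes[:, 3] - boxes[:, 1])
    iou = inter / (area_a + area_b - inter + 1e-9)

    # Centre-distance penalty: squared centre distance over squared enclosing-box diagonal
    cxa, cya = (box[0] + box[2]) / 2, (box[1] + box[3]) / 2
    cxb, cyb = (boxes[:, 0] + boxes[:, 2]) / 2, (boxes[:, 1] + boxes[:, 3]) / 2
    rho2 = (cxa - cxb) ** 2 + (cya - cyb) ** 2
    ex1 = np.minimum(box[0], boxes[:, 0]); ey1 = np.minimum(box[1], boxes[:, 1])
    ex2 = np.maximum(box[2], boxes[:, 2]); ey2 = np.maximum(box[3], boxes[:, 3])
    c2 = (ex2 - ex1) ** 2 + (ey2 - ey1) ** 2 + 1e-9

    # Aspect-ratio consistency term
    wa, ha = box[2] - box[0], box[3] - box[1]
    wb, hb = boxes[:, 2] - boxes[:, 0], boxes[:, 3] - boxes[:, 1]
    v = (4 / np.pi ** 2) * (np.arctan(wa / (ha + 1e-9)) - np.arctan(wb / (hb + 1e-9))) ** 2
    alpha = v / (1 - iou + v + 1e-9)
    return iou - rho2 / c2 - alpha * v

def soft_ciou_nms(boxes, scores, sigma=0.5, score_thresh=0.001):
    """Gaussian Soft-NMS: decay, rather than delete, boxes that overlap a kept box."""
    scores = scores.astype(float).copy()
    keep = []
    idxs = np.arange(len(boxes))
    while len(idxs) > 0:
        best = idxs[np.argmax(scores[idxs])]
        keep.append(best)
        idxs = idxs[idxs != best]
        if len(idxs) == 0:
            break
        overlap = np.clip(ciou(boxes[best], boxes[idxs]), 0, None)
        # Heavily overlapped boxes (e.g. stacked eggs) survive with reduced
        # confidence instead of being suppressed outright.
        scores[idxs] *= np.exp(-overlap ** 2 / sigma)
        idxs = idxs[scores[idxs] > score_thresh]
    return keep, scores
```

For dense stacking, this keeps a second, heavily occluded egg in the output with a reduced score, where hard NMS would have removed it entirely once its IoU with the top detection crossed the threshold.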
Pages: 15
References
37 entries
[1]   Soft-NMS - Improving Object Detection With One Line of Code [J].
Bodla, Navaneeth ;
Singh, Bharat ;
Chellappa, Rama ;
Davis, Larry S. .
2017 IEEE INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV), 2017, :5562-5570
[2]   Eggshell crack detection using deep convolutional neural networks [J].
Botta, Bhavya ;
Gattam, Sai Swaroop Reddy ;
Datta, Ashis Kumar .
JOURNAL OF FOOD ENGINEERING, 2022, 315
[3]   Visual Guidance and Egg Collection Scheme for a Smart Poultry Robot for Free-Range Farms [J].
Chang, Chung-Liang ;
Xie, Bo-Xuan ;
Wang, Chia-Hui .
SENSORS, 2020, 20 (22) :1-20
[4]   A lightweight vehicles detection network model based on YOLOv5 [J].
Dong, Xudong ;
Yan, Shuai ;
Duan, Chaoqun .
ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2022, 113
[5]  
Howard AG, 2017, arXiv preprint, arXiv:1704.04861
[6]   Pigeon cleaning behavior detection algorithm based on light-weight network [J].
Guo, Jianjun ;
He, Guohuang ;
Deng, Hao ;
Fan, Wenting ;
Xu, Longqin ;
Cao, Liang ;
Feng, Dachun ;
Li, Jingbin ;
Wu, Huilin ;
Lv, Jiawei ;
Liu, Shuangyin ;
Hassan, Shahbaz Gul .
COMPUTERS AND ELECTRONICS IN AGRICULTURE, 2022, 199
[7]   GhostNet: More Features from Cheap Operations [J].
Han, Kai ;
Wang, Yunhe ;
Tian, Qi ;
Guo, Jianyuan ;
Xu, Chunjing ;
Xu, Chang .
2020 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2020, :1577-1586
[8]   Searching for MobileNetV3 [J].
Howard, Andrew ;
Sandler, Mark ;
Chu, Grace ;
Chen, Liang-Chieh ;
Chen, Bo ;
Tan, Mingxing ;
Wang, Weijun ;
Zhu, Yukun ;
Pang, Ruoming ;
Vasudevan, Vijay ;
Le, Quoc V. ;
Adam, Hartwig .
2019 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2019), 2019, :1314-1324
[9]  
Hu J, 2018, PROC CVPR IEEE, P7132, DOI 10.1109/CVPR.2018.00745
[10]  
Joffe B. P., 2017, 2017 ASABE ANN INT M, P1, DOI 10.13031/aim.201700397