Real-Time Classification of Chicken Parts in the Packaging Process Using Object Detection Models Based on Deep Learning

Times Cited: 0
Authors
Sahin, Dilruba [1 ]
Torkul, Orhan [1 ]
Sisci, Merve [2 ,3 ]
Diren, Deniz Demircioglu [3 ]
Yilmaz, Recep [4 ]
Kibar, Alpaslan [5 ]
Affiliations
[1] Sakarya Univ, Ind Engn Dept, TR-54050 Sakarya, Turkiye
[2] Kutahya Dumlupinar Univ, Ind Engn Dept, TR-43300 Kutahya, Turkiye
[3] Sakarya Univ, Dept Informat Syst & Technol, TR-54050 Sakarya, Turkiye
[4] Sakarya Univ, Business Sch, TR-54050 Sakarya, Turkiye
[5] Sakarya Univ, Dept Management Informat Syst, TR-54050 Sakarya, Turkiye
Keywords
chicken parts; deep learning; image processing; object detection; reducing waste and costs; RT-DETR; YOLOv8; FRAMEWORK; HEALTH;
DOI
10.3390/pr13041005
CLC Number
TQ [Chemical Industry];
Subject Classification Code
0817;
Abstract
Chicken meat plays an important role in the healthy diets of many people and has a large global trade volume. In the chicken meat sector, traditional methods are still used in some production processes. Chicken part sorting, in particular, is often manual and time-consuming, especially during packaging. This study aimed to identify and classify chicken parts entering the packaging process with the highest possible accuracy and speed. For this purpose, deep-learning-based object detection models were used. An image dataset was developed for the classification models by collecting image data of different chicken parts, such as legs, breasts, shanks, wings, and drumsticks. The models were trained with variants of the You Only Look Once version 8 (YOLOv8) algorithm and the Real-Time Detection Transformer (RT-DETR) algorithm, and were then evaluated and compared using precision, recall, F1-score, mean average precision (mAP), and Mean Inference Time per Frame (MITF). Based on the results, the YOLOv8s model outperformed the models developed with the other YOLOv8 variants and the RT-DETR variants, achieving 0.9969, 0.9950, and 0.9807 for the F1-score, mAP@0.5, and mAP@0.5:0.95, respectively. With an MITF of 10.3 ms/image, it was also shown to be suitable for real-time applications.
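As an illustration of the training and evaluation workflow summarized in the abstract, the following is a minimal Python sketch using the Ultralytics YOLO API; it is not the authors' actual pipeline. The dataset configuration file name "chicken_parts.yaml", the epoch count, and the image size are illustrative assumptions, not values reported in the paper.

    # Minimal sketch: fine-tune a YOLOv8s detector on a custom dataset and
    # report the detection metrics named in the abstract.
    from ultralytics import YOLO

    # Load a pretrained YOLOv8s checkpoint and fine-tune it on the custom
    # chicken-part dataset (dataset file, epochs, and image size are assumptions).
    model = YOLO("yolov8s.pt")
    model.train(data="chicken_parts.yaml", epochs=100, imgsz=640)

    # Validate on the held-out split; Ultralytics reports precision, recall,
    # mAP@0.5, and mAP@0.5:0.95.
    metrics = model.val()
    precision = metrics.box.mp      # mean precision over classes
    recall = metrics.box.mr         # mean recall over classes
    map50 = metrics.box.map50       # mAP at IoU threshold 0.5
    map50_95 = metrics.box.map      # mAP averaged over IoU 0.5:0.95

    # F1-score is the harmonic mean of precision and recall.
    f1 = 2 * precision * recall / (precision + recall)
    print(f"P={precision:.4f} R={recall:.4f} F1={f1:.4f} "
          f"mAP@0.5={map50:.4f} mAP@0.5:0.95={map50_95:.4f}")

The same validation call could be applied to an RT-DETR checkpoint (e.g., via the Ultralytics RTDETR class) to obtain a like-for-like metric comparison of the kind reported in the study.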
Pages: 21