Research on Road Object Detection Model Based on YOLOv4 of Autonomous Vehicle

Cited by: 8
Authors
Wang, Penghui [1 ]
Wang, Xufei [1 ]
Liu, Yifan [2 ]
Song, Jeongyoung [3 ]
Affiliations
[1] Shaanxi Univ Sci & Technol, Dept Mech Engn, Hanzhong 723000, Peoples R China
[2] Sanmenxia Coll Social Adm, Dept New Energy, Sanmenxia 472000, Peoples R China
[3] Pai Chai Univ, Dept Comp Engn, Daejeon 35345, South Korea
Keywords
Object detection; YOLOv4; Mobilenetv2; SENet; EIOU; ALGORITHM; RECOGNITION;
D O I
10.1109/ACCESS.2024.3351771
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812 ;
Abstract
The YOLOv4 network is widely used as a representative network for object detection tasks, but its model complexity limits detection speed. In this paper, we propose MV2-S-YE, an improved object detection algorithm based on the YOLOv4 network, which increases road object detection speed while also improving detection accuracy. First, the CSPDarknet53 backbone of YOLOv4 is replaced with the MobileNetv2 network to reduce the number of network parameters. Second, a channel attention mechanism is introduced by embedding SENet modules into the PANet structure to improve detection accuracy. Finally, the EIOU loss function replaces the CIOU loss function to improve detection accuracy further. The resulting MV2-S-YE network is tested on the Pascal VOC, Udacity, and KAIST datasets and compared against YOLOv4, YOLOv4-tiny, YOLOv7-tiny, and YOLOv8s. The results show that MV2-S-YE achieves mAP@0.5 of 80.9%, 66.7%, and 94.8% on the VOC2007, Udacity, and KAIST test sets, respectively, exceeding YOLOv8s on both the Udacity and KAIST test sets. On the VOC2007 test set, MV2-S-YE reaches a detection speed of 45 FPS, which is higher than that of YOLOv8s.
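The abstract's third change swaps CIOU for EIOU bounding-box regression. As a rough illustration of the idea (not the authors' implementation, whose details may differ), the published EIOU loss keeps the IoU and center-distance terms of CIOU but replaces the aspect-ratio penalty with separate width and height penalties, each normalized by the enclosing box:

```python
def eiou_loss(box_a, box_b):
    """EIOU loss between two boxes given as (x1, y1, x2, y2) corner tuples."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    # Plain IoU term from intersection over union.
    ix1, iy1 = max(ax1, bx1), max(ay1, by1)
    ix2, iy2 = min(ax2, bx2), min(ay2, by2)
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (ax2 - ax1) * (ay2 - ay1)
    area_b = (bx2 - bx1) * (by2 - by1)
    iou = inter / (area_a + area_b - inter)
    # Smallest enclosing box: its diagonal, width, and height normalize the penalties.
    cw = max(ax2, bx2) - min(ax1, bx1)
    ch = max(ay2, by2) - min(ay1, by1)
    c2 = cw ** 2 + ch ** 2
    # Squared distance between the two box centers (same term as in CIOU/DIOU).
    d2 = ((ax1 + ax2) / 2 - (bx1 + bx2) / 2) ** 2 \
       + ((ay1 + ay2) / 2 - (by1 + by2) / 2) ** 2
    # Separate width/height penalties -- what EIOU adds over CIOU's aspect-ratio term.
    w_pen = ((ax2 - ax1) - (bx2 - bx1)) ** 2 / cw ** 2
    h_pen = ((ay2 - ay1) - (by2 - by1)) ** 2 / ch ** 2
    return 1.0 - iou + d2 / c2 + w_pen + h_pen
```

Penalizing width and height errors directly, rather than through a coupled aspect-ratio term, gives a nonzero gradient even when the predicted box has the right aspect ratio but the wrong scale, which is the usual motivation for preferring EIOU over CIOU.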
Pages: 8198-8206
Page count: 9