Automatic visual defects inspection of wind turbine blades via YOLO-based small object detection approach

Cited by: 58
Authors
Qiu, Zifeng [1 ]
Wang, Shuangxin [1 ]
Zeng, Zhaoxi [2 ]
Yu, Dingli [3 ]
Affiliations
[1] Beijing Jiaotong Univ, Sch Mech Elect & Control Engn, Beijing, Peoples R China
[2] Univ Penn, Grad Sch Educ, Philadelphia, PA 19104 USA
[3] Liverpool John Moores Univ, Control Syst Ctr, Sch Engn, Liverpool, Merseyside, England
Funding
National Natural Science Foundation of China
Keywords
wind turbine blades; small object; automatic visual detection; convolutional neural network; you only look once;
DOI
10.1117/1.JEI.28.4.043023
Chinese Library Classification
TM [Electrical Engineering]; TN [Electronic Technology, Communication Technology]
Discipline Classification Code
0808; 0809
Abstract
Regular inspection of wind turbine blades (WTBs), especially the detection of tiny defects, is necessary to maintain the safe operation of wind turbine systems. However, current inspections are inefficient and subjective because they are carried out solely by human inspectors. An autonomous visual inspection system for WTBs is proposed in this paper, in which a deep learning framework is developed by combining a convolutional neural network (CNN) with the you only look once (YOLO) model. To achieve practically acceptable detection accuracy for small-sized defects on WTBs, a YOLO-based small object detection approach (YSODA) is proposed that builds a multiscale feature pyramid by amalgamating features from more layers. To evaluate the proposed YSODA, a database of 23,807 images labeled for three types of defect, namely crack, oil pollution, and sand inclusion, is developed. The YSODA, with its architecture modified accordingly, is then trained, validated, and tested on images from this database to provide autonomous and accurate visual inspection. After training and testing, the detection accuracy reaches 92.7%, 90.7%, and 90.3% for the three defect types, with an average accuracy of 91.3%. The robustness of the trained YSODA in detecting small-sized defects is demonstrated and verified. Its performance is also compared with that of traditional CNN-based and machine learning methods on a real WTB system, and the results show that the proposed YSODA is superior to existing approaches in terms of detection accuracy and reliability. (C) 2019 SPIE and IS&T
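The abstract describes amalgamating features from more backbone layers into a multiscale feature pyramid so that fine detail needed for small defects survives at the detection head. The PyTorch sketch below illustrates that general idea only; the class name MultiScaleFusion, the channel widths, and the feature-map sizes are illustrative assumptions, not the published YSODA architecture.

```python
# Minimal, illustrative sketch (not the authors' exact YSODA design):
# fuse feature maps from several backbone depths so the detection head
# sees the fine-grained detail needed for small defects.
import torch
import torch.nn as nn


class MultiScaleFusion(nn.Module):
    """Project each scale to a common width, upsample the deeper (coarser)
    maps to the finest resolution, and concatenate before prediction."""

    def __init__(self, channels=(256, 512, 1024), fused_channels=256):
        super().__init__()
        # 1x1 convolutions project every scale to the same channel width.
        self.lateral = nn.ModuleList(
            nn.Conv2d(c, fused_channels, kernel_size=1) for c in channels
        )
        self.smooth = nn.Conv2d(len(channels) * fused_channels,
                                fused_channels, kernel_size=3, padding=1)

    def forward(self, features):
        # `features` are ordered from shallow (high resolution) to deep.
        target_size = features[0].shape[-2:]
        projected = []
        for feat, lateral in zip(features, self.lateral):
            p = lateral(feat)
            # Bring every scale to the finest resolution before fusing.
            p = nn.functional.interpolate(p, size=target_size, mode="nearest")
            projected.append(p)
        return self.smooth(torch.cat(projected, dim=1))


if __name__ == "__main__":
    # Dummy feature maps at three backbone depths for a 416x416 input.
    feats = [torch.randn(1, 256, 52, 52),
             torch.randn(1, 512, 26, 26),
             torch.randn(1, 1024, 13, 13)]
    fused = MultiScaleFusion()(feats)
    print(fused.shape)  # torch.Size([1, 256, 52, 52])
```

A detection head applied to the fused 52x52 map keeps finer spatial resolution than a head on the 13x13 map alone, which is the intuition behind pyramid-style fusion for small-object detection.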
Pages: 11
Related Papers
20 items in total
[1] Birenbaum, Ariel; Greenspan, Hayit. Multi-view longitudinal CNN for multiple sclerosis lesion segmentation. Engineering Applications of Artificial Intelligence, 2017, 65: 111-118.
[2] Cha, Young-Jin; Choi, Wooram; Suh, Gahyun; Mahmoudkhani, Sadegh; Buyukozturk, Oral. Autonomous Structural Visual Inspection Using Region-Based Deep Learning for Detecting Multiple Damage Types. Computer-Aided Civil and Infrastructure Engineering, 2018, 33(9): 731-747.
[3] Davis, A. European Journal of Cancer Prevention, 2018, 27: 1. DOI: 10.1097/CEJ.0000000000000244.
[4] Galleguillos, C.; Zorrilla, A.; Jimenez, A.; Diaz, L.; Montiano, A. L.; Barroso, M.; Viguria, A.; Lasagni, F. Thermographic non-destructive inspection of wind turbine blades using unmanned aerial systems. Plastics, Rubber and Composites, 2015, 44(3): 98-103.
[5] Han, Byeong-Hee; Yoon, Dong-Jin; Huh, Yong-Hak; Lee, Young-Shin. Damage assessment of wind turbine blade under static loading test using acoustic emission. Journal of Intelligent Material Systems and Structures, 2014, 25(5): 621-630.
[6] Joosse, P. A. Journal of Solar Energy Engineering, 2012, 124: 401.
[7] Krizhevsky, Alex; Sutskever, Ilya; Hinton, Geoffrey E. ImageNet Classification with Deep Convolutional Neural Networks. Communications of the ACM, 2017, 60(6): 84-90.
[8] Langkvist, Martin; Kiselev, Andrey; Alirezaie, Marjan; Loutfi, Amy. Classification and Segmentation of Satellite Orthoimagery Using Convolutional Neural Networks. Remote Sensing, 2016, 8(4).
[9] Lee, Jae-Kyung; Park, Joon-Young; Oh, Ki-Yong; Ju, Seung-Hwan; Lee, Jun-Shin. Transformation algorithm of wind turbine blade moment signals for blade condition monitoring. Renewable Energy, 2015, 79: 209-218.
[10] Peng, Lin; Liu, Jun. Detection and analysis of large-scale WT blade surface cracks based on UAV-taken images. IET Image Processing, 2018, 12(11): 2059-2064.