A Comparative Study on Crack Detection in Concrete Walls Using Transfer Learning Techniques

Cited by: 30
Authors
Philip, Remya Elizabeth [1 ]
Andrushia, A. Diana [1 ]
Nammalvar, Anand [2 ]
Gurupatham, Beulah Gnana Ananthi [3 ]
Roy, Krishanu [4 ]
Affiliations
[1] Karunya Inst Technol & Sci, Dept Elect & Commun Engn, Coimbatore 641114, India
[2] Karunya Inst Technol & Sci, Dept Civil Engn, Coimbatore 641114, India
[3] Anna Univ, Coll Engn, Div Struct Engn, Guindy Campus, Chennai 600025, India
[4] Univ Waikato, Sch Engn, Hamilton 3216, New Zealand
Keywords
transfer learning; crack detection; concrete wall; convolutional neural network; structural health monitoring; PAVEMENT;
DOI
10.3390/jcs7040169
Chinese Library Classification (CLC)
TB33 [Composite materials];
Discipline Classification Code
Abstract
Structural cracks have serious repercussions for the safety, adaptability, and longevity of structures. Crack assessment is therefore an important parameter when evaluating the quality of concrete construction. With the development of numerous cutting-edge automated inspection systems that detect cracks, the need for personal onsite inspection has decreased substantially. However, these methods still need improvement in terms of cost efficiency and accuracy. Deep-learning-based assessment approaches for structural systems have seen significant development and attracted the attention of the structural health monitoring (SHM) community. Convolutional neural networks (CNNs) are central to these deep learning methods and hold promise for precise and accurate condition evaluation. Moreover, transfer learning enables practitioners to apply CNNs without a comprehensive grasp of the underlying algorithms, by adapting pre-trained networks to particular tasks. In this study, a thorough analysis of well-known pre-trained networks for classifying cracks in concrete buildings is conducted. The classification performance of CNN architectures such as VGG16, VGG19, ResNet50, MobileNet, and Xception is compared on a concrete crack image dataset. The ResNet50-based classifier achieved accuracy scores of 99.91% for training and 99.88% for testing. The Xception architecture delivered the lowest performance, with training and test accuracies of 99.64% and 98.82%, respectively.
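The transfer-learning workflow the abstract describes (reusing an ImageNet-pretrained backbone and training a small crack/no-crack classification head) can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' training configuration: the backbone choice (ResNet50), input size, head layers, optimizer settings, and the crack_dataset/ directory layout are all hypothetical.

# Minimal transfer-learning sketch (assumption: not the paper's exact setup).
# Pattern: freeze an ImageNet-pretrained ResNet50 backbone and train a small
# binary crack / no-crack classification head on concrete crack images.
import tensorflow as tf
from tensorflow.keras import layers, models

IMG_SIZE = (224, 224)  # assumed input size; the paper's preprocessing may differ

# Frozen pretrained backbone
backbone = tf.keras.applications.ResNet50(
    include_top=False, weights="imagenet", input_shape=IMG_SIZE + (3,)
)
backbone.trainable = False

# Small classification head for crack vs. no-crack
model = models.Sequential([
    backbone,
    layers.GlobalAveragePooling2D(),
    layers.Dense(128, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(1, activation="sigmoid"),
])

model.compile(
    optimizer=tf.keras.optimizers.Adam(1e-4),
    loss="binary_crossentropy",
    metrics=["accuracy"],
)

# "crack_dataset/" with class subfolders is a hypothetical directory layout.
train_ds = tf.keras.utils.image_dataset_from_directory(
    "crack_dataset/train", image_size=IMG_SIZE, batch_size=32, label_mode="binary"
)
val_ds = tf.keras.utils.image_dataset_from_directory(
    "crack_dataset/val", image_size=IMG_SIZE, batch_size=32, label_mode="binary"
)

# ResNet50 expects its own preprocessing of raw pixel values
preprocess = tf.keras.applications.resnet50.preprocess_input
train_ds = train_ds.map(lambda x, y: (preprocess(x), y))
val_ds = val_ds.map(lambda x, y: (preprocess(x), y))

model.fit(train_ds, validation_data=val_ds, epochs=10)

The same head-swapping pattern applies to the other backbones compared in the paper (VGG16, VGG19, MobileNet, Xception) by replacing the backbone constructor and its matching preprocess_input function.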
Pages: 22