PreAugNet: improve data augmentation for industrial defect classification with small-scale training data

Authors
Isack Farady
Chih-Yang Lin
Ming-Ching Chang
Affiliations
[1] Yuan Ze University,Electrical Engineering
[2] Mercu Buana University,Electrical Engineering
[3] National Central University,Mechanical Engineering
[4] University at Albany,Computer Science
Source
Journal of Intelligent Manufacturing | 2024 / Vol. 35
Keywords
Data augmentation; Synthetic sample generation; CNN; Surface defect classification; Decision boundary; PreAugNet;
Abstract
With the prevalence of deep learning and convolutional neural networks (CNNs), data augmentation is widely used to enrich training samples and improve model training, and it is especially important when training samples are scarce. This work focuses on improving data augmentation for training an industrial steel surface defect classification network, whose performance largely depends on the availability of high-quality training samples. In real-world settings, it is very difficult to obtain a sufficiently large dataset for this application. With synthetic data augmentation, performance is often degraded by incorrectly labeled samples, and generating high-quality samples requires substantial effort. This paper introduces a novel off-line pre-augmentation network (PreAugNet) that acts as a class boundary classifier to effectively screen the quality of augmented samples and improve image augmentation. PreAugNet generates augmented samples and updates decision boundaries via an independent support vector machine (SVM) classifier. The accepted samples are automatically distributed and combined with the original data for training the target network. Experiments show that these new augmentation samples improve classification without changing the target network architecture. The proposed method for steel surface defect inspection is evaluated on three real-world datasets: the AOI steel defect, MT, and NEU datasets. PreAugNet increases accuracy by 3.3% (AOI), 6.25% (MT), and 2.1% (NEU), respectively.
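A minimal sketch of the sample-screening idea described in the abstract, assuming image features have already been extracted and using a scikit-learn SVM as the boundary classifier; the function name, kernel choice, and confidence threshold are illustrative assumptions rather than the authors' implementation.

    # Sketch (not the authors' code): screen augmented samples with an SVM
    # trained on features of the original data, keeping only samples whose
    # predicted class matches their intended label with enough confidence.
    import numpy as np
    from sklearn.svm import SVC

    def screen_augmented(orig_feats, orig_labels, aug_feats, aug_labels,
                         min_confidence=0.6):
        """Return the augmented samples that fall on the correct side of
        the SVM decision boundary learned from the original data."""
        svm = SVC(kernel="rbf", probability=True)
        svm.fit(orig_feats, orig_labels)            # boundary from original data
        probs = svm.predict_proba(aug_feats)        # class probabilities
        preds = svm.classes_[np.argmax(probs, axis=1)]
        conf = np.max(probs, axis=1)
        keep = (preds == aug_labels) & (conf >= min_confidence)
        return aug_feats[keep], aug_labels[keep]

    # The accepted samples would then be concatenated with the original
    # training set before training the target defect classification CNN.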
Pages: 1233–1246
Number of pages: 13