Lightweight and Elegant Data Reduction Strategies for Training Acceleration of Convolutional Neural Networks

Cited by: 0
Authors
Demidovskij, Alexander [1 ,2 ]
Tugaryov, Artyom [1 ]
Trutnev, Aleksei [1 ]
Kazyulina, Marina [1 ]
Salnikov, Igor [1 ]
Pavlov, Stanislav [1 ]
Affiliations
[1] Huawei Russian Res Inst, NN AI Team, Ul Maksima Gorkogo 117, Nizhnii Novgorod 603006, Russia
[2] Natl Res Univ, Higher Sch Econ, Dept Informat Math & Comp Sci, Ul Bolshaya Pecherskaya 25-12, Nizhnii Novgorod 603155, Russia
Keywords
deep learning training; training acceleration; convolutional neural networks; sample importance; dataset reduction;
DOI
10.3390/math11143120
Chinese Library Classification
O1 [Mathematics]
Subject classification code
0701; 070101
Abstract
Due to industrial demands to handle increasing amounts of training data, lower the cost of computing one model at a time, and lessen the ecological effects of intensive computing resource consumption, the task of accelerating the training of deep neural networks is becoming increasingly challenging. This paper presents two novel training acceleration methods: Adaptive Online Importance Sampling and Intellectual Data Selection (IDS). On the one hand, Adaptive Online Importance Sampling accelerates neural network training by reducing the number of forward and backward passes depending on how poorly the model recognizes a given data sample. On the other hand, Intellectual Data Selection accelerates training by removing semantic redundancies from the training dataset and thereby reducing the number of training steps. The study reports an average 1.9x training acceleration for ResNet50, ResNet18, MobileNet v2, and YOLO v5 on a variety of datasets: CIFAR-100, CIFAR-10, ImageNet 2012, and MS COCO 2017, where the training data are reduced by up to five times. Applying Adaptive Online Importance Sampling to ResNet50 training on ImageNet 2012 yields 2.37x faster convergence to 71.7% top-1 accuracy, which is within 5% of the baseline. Total training time for the same number of epochs as the baseline is reduced by 1.82 times, with an accuracy drop of 2.45 p.p. Applying Intellectual Data Selection to ResNet50 training on ImageNet 2012 reduces training time by 1.27 times, with a corresponding accuracy drop of 1.12 p.p. Applying both methods to ResNet50 training on ImageNet 2012 results in a 2.31x speedup with an accuracy drop of 3.5 p.p.
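The two mechanisms described in the abstract can be illustrated with a minimal PyTorch sketch. This is an assumption-laden reading of the abstract, not the authors' published implementation: the function names importance_sampled_step and select_diverse_indices, and the parameters keep_ratio and keep_fraction, are hypothetical. The first function scores every sample in a batch with a no-grad forward pass and backpropagates only through the fraction the model handles worst, which is one way to "lower the number of forward and backward steps" based on sample difficulty; the second greedily keeps mutually distant feature embeddings, one possible interpretation of "removing semantic redundancies" from the training set.

import torch
import torch.nn as nn

def importance_sampled_step(model, optimizer, images, labels, keep_ratio=0.5):
    # Loss-based online importance sampling (illustrative sketch).
    criterion = nn.CrossEntropyLoss(reduction="none")

    # Cheap no-grad forward pass to score every sample in the batch.
    with torch.no_grad():
        per_sample_loss = criterion(model(images), labels)

    # Keep only the hardest samples (largest loss) for the gradient update.
    k = max(1, int(keep_ratio * images.size(0)))
    hard_idx = per_sample_loss.topk(k).indices

    optimizer.zero_grad()
    loss = criterion(model(images[hard_idx]), labels[hard_idx]).mean()
    loss.backward()
    optimizer.step()
    return loss.item()

def select_diverse_indices(features, keep_fraction=0.2):
    # Greedy farthest-point selection over feature embeddings: keeps a
    # subset whose embeddings are mutually distant, dropping near-duplicates.
    n = features.size(0)
    k = max(1, int(keep_fraction * n))
    chosen = [0]  # start from an arbitrary sample
    dist = torch.cdist(features[0:1], features).squeeze(0)
    for _ in range(k - 1):
        idx = int(dist.argmax())
        chosen.append(idx)
        dist = torch.minimum(dist, torch.cdist(features[idx:idx + 1], features).squeeze(0))
    return torch.tensor(chosen)

In this sketch, importance_sampled_step would replace the usual per-batch update inside a training loop, while select_diverse_indices would be run once offline over precomputed embeddings to build the reduced training subset.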
Pages: 25