Online Hyper-parameter Learning for Auto-Augmentation Strategy

Cited by: 59
Authors
Lin, Chen [1]
Guo, Minghao [1]
Li, Chuming [1]
Yuan, Xin [1]
Wu, Wei [1]
Yan, Junjie [1]
Lin, Dahua [2]
Ouyang, Wanli [3]
Affiliations
[1] SenseTime Grp Ltd, Hong Kong, Peoples R China
[2] Chinese Univ Hong Kong, Hong Kong, Peoples R China
[3] Univ Sydney, SenseTime Comp Vis Res Grp, Sydney, NSW, Australia
DOI: 10.1109/ICCV.2019.00668
Chinese Library Classification: TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
Data augmentation is critical to the success of modern deep learning techniques. In this paper, we propose Online Hyper-parameter Learning for Auto-Augmentation (OHL-Auto-Aug), an economical solution that learns the augmentation policy distribution along with network training. Unlike previous auto-augmentation methods that search augmentation strategies in an offline manner, our method formulates the augmentation policy as a parameterized probability distribution, thus allowing its parameters to be optimized jointly with network parameters. Our proposed OHL-Auto-Aug eliminates the need for re-training and dramatically reduces the cost of the overall search process, while establishing significant accuracy improvements over baseline models. On both CIFAR-10 and ImageNet, our method achieves remarkable search efficiency, i.e. 60x faster on CIFAR-10 and 24x faster on ImageNet, while maintaining competitive accuracies.
Pages: 6578-6587 (10 pages)
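The abstract above formulates the augmentation policy as a parameterized probability distribution whose parameters are optimized jointly with the network parameters during training. The following sketch (not the authors' code) illustrates one way such an online policy update could look: a categorical distribution over candidate augmentation operations is updated with a REINFORCE-style gradient. The candidate operations, reward function, and learning rate are invented for this example and are not taken from the paper.

# Rough illustration of an online-learned augmentation policy distribution,
# assuming PyTorch; names below are hypothetical, not from the paper.
import torch

candidate_policies = ["identity", "flip", "rotate", "color_jitter"]  # hypothetical

# Trainable logits parameterize the distribution over augmentation policies.
policy_logits = torch.zeros(len(candidate_policies), requires_grad=True)
policy_optimizer = torch.optim.Adam([policy_logits], lr=0.05)

def sample_policy():
    """Sample a policy index and return it with its log-probability."""
    dist = torch.distributions.Categorical(logits=policy_logits)
    idx = dist.sample()
    return idx.item(), dist.log_prob(idx)

def update_policy(log_probs, rewards):
    """REINFORCE step: shift probability mass toward higher-reward policies."""
    rewards = torch.tensor(rewards)
    baseline = rewards.mean()  # simple baseline for variance reduction
    loss = -torch.stack([lp * (r - baseline)
                         for lp, r in zip(log_probs, rewards)]).sum()
    policy_optimizer.zero_grad()
    loss.backward()
    policy_optimizer.step()

def reward_fn(policy_idx):
    # Stand-in for "train a trajectory with this policy, measure validation
    # accuracy"; here we simply pretend one policy is the best.
    return 1.0 if policy_idx == 2 else 0.0

for step in range(200):
    samples = [sample_policy() for _ in range(8)]   # several parallel trajectories
    log_probs = [lp for _, lp in samples]
    rewards = [reward_fn(idx) for idx, _ in samples]
    update_policy(log_probs, rewards)

print(torch.softmax(policy_logits, dim=0))  # mass concentrates on the best policy

In the setting the abstract describes, the reward would come from the performance of the network being trained under the sampled policies; the dummy reward above is only there to keep the sketch self-contained and runnable.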
Related papers
50 in total (10 listed below)
  • [1] ONLINE HYPER-PARAMETER TUNING FOR THE CONTEXTUAL BANDIT
    Bouneffouf, Djallel
    Claeys, Emmanuelle
    2021 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP 2021), 2021, : 3445 - 3449
  • [2] Federated learning with hyper-parameter optimization
    Kundroo, Majid
    Kim, Taehong
    JOURNAL OF KING SAUD UNIVERSITY-COMPUTER AND INFORMATION SCIENCES, 2023, 35 (09)
  • [3] Auto-augmentation: ought it to be done?
    Suzanne Lawther
    David Marshall
    Alan Bailie
    Stephen Brown
    Cerebrospinal Fluid Research, 4 (Suppl 1):
  • [4] Nonlinear Set Membership Regression with Adaptive Hyper-Parameter Estimation for Online Learning and Control
    Calliess, Jan-Peter
    Roberts, Stephen
    Rasmussen, Carl
    Maciejowski, Jan
    2018 EUROPEAN CONTROL CONFERENCE (ECC), 2018, : 3167 - 3172
  • [5] An Experimental Study on Hyper-parameter Optimization for Stacked Auto-Encoders
    Sun, Yanan
    Xue, Bing
    Zhang, Mengjie
    Yen, Gary G.
    2018 IEEE CONGRESS ON EVOLUTIONARY COMPUTATION (CEC), 2018, : 638 - 645
  • [6] An efficient hyper-parameter optimization method for supervised learning
    Shi, Ying
    Qi, Hui
    Qi, Xiaobo
    Mu, Xiaofang
    APPLIED SOFT COMPUTING, 2022, 126
  • [7] LAPAROSCOPIC RETROPUBIC AUTO-AUGMENTATION OF THE BLADDER
    MCDOUGALL, EM
    CLAYMAN, RV
    FIGENSHAU, RS
    PEARLE, MS
    JOURNAL OF UROLOGY, 1995, 153 (01): : 123 - 126
  • [8] Automatic CNN Compression Based on Hyper-parameter Learning
    Tian, Nannan
    Liu, Yong
    Wang, Weiping
    Meng, Dan
    2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021,
  • [9] HYPER-PARAMETER LEARNING FOR SPARSE STRUCTURED PROBABILISTIC MODELS
    Shpakova, Tatiana
    Bach, Francis
    Davies, Mike
    2019 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2019, : 3347 - 3351
  • [10] Simultaneous Salvage Auto-augmentation: Contemporary Strategy for Management of the Breast Explantation Patient
    Kirwan, Laurence
    Wazir, Umar
    Mokbel, Kefah
    PLASTIC AND RECONSTRUCTIVE SURGERY-GLOBAL OPEN, 2023, 11 (03) : E4860