Split-Level Evolutionary Neural Architecture Search With Elite Weight Inheritance

Cited by: 20
Authors
Huang, Junhao [1 ]
Xue, Bing [1 ]
Sun, Yanan [2 ]
Zhang, Mengjie [1 ]
Yen, Gary G. [3 ]
Affiliations
[1] Victoria University of Wellington, School of Engineering and Computer Science, Wellington 6140, New Zealand
[2] Sichuan University, School of Computer Science, Chengdu 610065, China
[3] Oklahoma State University, School of Electrical and Computer Engineering, Stillwater, OK 74078, USA
Funding
National Natural Science Foundation of China
Keywords
Deep neural networks; image classification; neural architecture search (NAS); particle swarm optimization (PSO)
DOI
10.1109/TNNLS.2023.3269816
CLC number
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Neural architecture search (NAS) has recently gained extensive interest in the deep learning community because of its great potential for automating the construction of deep models. Among the variety of NAS approaches, evolutionary computation (EC) plays a pivotal role owing to its gradient-free search ability. However, many current EC-based NAS approaches evolve neural architectures in a purely discrete manner, which makes it difficult to handle the number of filters per layer flexibly: that number is typically restricted to a small predefined set rather than searched over all possible values. Moreover, EC-based NAS methods are often criticized for inefficient performance evaluation, which usually requires the laborious full training of hundreds of generated candidate architectures. To address the inflexibility in searching the number of filters, this work proposes a split-level particle swarm optimization (PSO) approach. Each dimension of a particle is subdivided into an integer part and a fractional part, which encode the configuration of the corresponding layer and its number of filters within a large range, respectively. In addition, evaluation time is greatly reduced by a novel elite weight inheritance method based on an online-updated weight pool, and a customized fitness function considering multiple objectives is developed to control the complexity of the searched candidate architectures. The proposed method, termed split-level evolutionary NAS (SLE-NAS), is computationally efficient and outperforms many state-of-the-art peer competitors at much lower complexity across three popular image classification benchmark datasets.
Pages: 13523-13537
Page count: 15
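
To make the split-level encoding described in the abstract concrete, the following is a minimal sketch of how one particle dimension might decode into a layer. The operation list (LAYER_OPS), the filter range, and the function name decode_dimension are illustrative assumptions, not the paper's actual search space or implementation:

import math

# Hypothetical candidate operations for one layer; the real configuration
# space of SLE-NAS is defined in the paper, not here.
LAYER_OPS = ["conv3x3", "conv5x5", "dw_sep_conv", "skip"]

# Assumed filter range; the paper searches filters within a large range.
FILTER_MIN, FILTER_MAX = 16, 512


def decode_dimension(x: float):
    """Split one particle dimension into an integer and a fractional part.

    The integer part indexes the layer configuration (operation type),
    while the fractional part is rescaled to a filter count within
    [FILTER_MIN, FILTER_MAX], so the number of filters is searched over
    a continuous range rather than a small predefined set.
    """
    int_part = int(math.floor(x))
    frac_part = x - int_part
    op = LAYER_OPS[int_part % len(LAYER_OPS)]
    n_filters = int(round(FILTER_MIN + frac_part * (FILTER_MAX - FILTER_MIN)))
    return op, n_filters


# Example: a particle with three dimensions decodes to a three-layer net.
particle = [0.25, 2.80, 1.05]
architecture = [decode_dimension(d) for d in particle]
print(architecture)
# [('conv3x3', 140), ('dw_sep_conv', 413), ('conv5x5', 41)]

Because both parts live in one real-valued dimension, a standard continuous PSO update can move the particle through operation choices and filter counts simultaneously, which is the flexibility the abstract contrasts with purely discrete encodings.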