A Gradient-Guided Evolutionary Neural Architecture Search

Times Cited: 6
Authors
Xue, Yu [1 ]
Han, Xiaolong [1 ]
Neri, Ferrante [2 ]
Qin, Jiafeng [1 ]
Pelusi, Danilo [3 ]
Affiliations
[1] Nanjing Univ Informat Sci & Technol, Sch Software, Nanjing 210044, Peoples R China
[2] Univ Surrey, Dept Comp Sci, Nat Inspired Comp & Engn Res Grp, Guildford GU2 7XH, England
[3] Univ Teramo, Fac Commun Sci, I-64100 Teramo, Italy
Funding
National Natural Science Foundation of China
Keywords
Computer architecture; Microprocessors; Search problems; Couplings; Evolutionary computation; Encoding; Statistics; gradient optimization; image classification; neural architecture search (NAS)
DOI
10.1109/TNNLS.2024.3371432
CLC Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Neural architecture search (NAS) is a popular method that can automatically design deep neural network structures. However, designing a neural network using NAS is computationally expensive. This article proposes a gradient-guided evolutionary NAS (GENAS) to design convolutional neural networks (CNNs) for image classification. GENAS is a hybrid algorithm that combines evolutionary global and local search operators to evolve a population of subnets sampled from a supernet. Each candidate architecture is encoded as a table that associates an operation with each edge between nodes, where the nodes represent feature maps. In addition, the evolutionary optimization uses novel crossover and mutation operators that manipulate the subnets through the proposed tabular encoding. Every n generations, the candidate architectures undergo a local search inspired by differentiable NAS. GENAS is designed to overcome the limitations of both evolutionary and gradient-descent NAS. This algorithmic structure enables the performance of a candidate architecture to be assessed without retraining, thus reducing the NAS computation time. Furthermore, subnet individuals are decoupled during evaluation to prevent strong coupling of operations in the supernet. The experimental results indicate that the searched structures achieve test errors of 2.45%, 16.86%, and 23.9% on the CIFAR-10, CIFAR-100, and ImageNet datasets, respectively, and the search costs only 0.26 GPU days on a single graphics card. GENAS can effectively expedite the training and evaluation processes and obtain high-performance network structures.
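As a rough illustration only (not the authors' code), the Python sketch below shows how the tabular edge-operation encoding and the hybrid evolutionary loop with a periodic gradient-guided local-search step might be organized. The operation set, edge count, population parameters, the `evaluate` placeholder (standing in for weight-sharing evaluation on the supernet), and `gradient_local_search` are all assumptions made for this sketch.

```python
import random

# Assumed operation set; GENAS searches a DARTS-style cell space, but the
# exact candidate operations and edge count here are placeholders.
OPS = ["skip_connect", "sep_conv_3x3", "sep_conv_5x5",
       "dil_conv_3x3", "max_pool_3x3", "avg_pool_3x3"]
NUM_EDGES = 14  # edges between feature-map nodes in one cell (assumed)


def random_subnet():
    # Tabular encoding: one operation per edge between feature-map nodes.
    return [random.choice(OPS) for _ in range(NUM_EDGES)]


def crossover(a, b):
    # Uniform crossover on the edge-operation table (illustrative only;
    # the paper proposes its own novel crossover operator).
    return [random.choice(pair) for pair in zip(a, b)]


def mutate(subnet, rate=0.1):
    # Point mutation: resample an edge's operation with probability `rate`.
    return [random.choice(OPS) if random.random() < rate else op
            for op in subnet]


def evaluate(subnet):
    # Placeholder: GENAS scores a subnet by inheriting the shared supernet
    # weights, so no retraining is needed. Here we fake a fitness score.
    return random.random()


def gradient_local_search(subnet):
    # Placeholder for the differentiable-NAS-inspired refinement applied
    # every n generations; a real version would relax the discrete choices
    # and take gradient steps on architecture parameters.
    return mutate(subnet, rate=0.05)


def genas(pop_size=20, generations=30, local_search_every=5):
    population = [random_subnet() for _ in range(pop_size)]
    for gen in range(generations):
        # Global (evolutionary) step: select, recombine, mutate.
        scored = sorted(population, key=evaluate, reverse=True)
        parents = scored[: pop_size // 2]
        children = [mutate(crossover(*random.sample(parents, 2)))
                    for _ in range(pop_size - len(parents))]
        population = parents + children
        # Local (gradient-guided) step every `local_search_every` generations.
        if (gen + 1) % local_search_every == 0:
            population = [gradient_local_search(s) for s in population]
    return max(population, key=evaluate)


if __name__ == "__main__":
    print(genas())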
Pages: 1-13
Number of Pages: 13