Evolutionary Architecture Search for Generative Adversarial Networks Based on Weight Sharing

Cited: 20
Authors
Xue, Yu [1 ]
Tong, Weinan [1 ]
Neri, Ferrante [2 ,3 ]
Chen, Peng [4 ,5 ]
Luo, Tao [6 ]
Zhen, Liangli [6 ]
Wang, Xiao [7 ]
Affiliations
[1] Nanjing Univ Informat Sci & Technol, Sch Software, Nanjing 210044, Peoples R China
[2] Nanjing Univ Informat Sci & Technol, Nanjing 210044, Peoples R China
[3] Univ Surrey, Dept Comp Sci, NICE Res Grp, Guildford GU2 7XH, England
[4] Natl Inst Adv Ind Sci & Technol, Tokyo 1350064, Japan
[5] RIKEN Ctr Computat Sci, Kobe 6500047, Japan
[6] Agcy Sci Technol & Res, Singapore, Singapore
[7] Oak Ridge Natl Lab, Oak Ridge, TN 37831 USA
Funding
National Natural Science Foundation of China;
Keywords
Training; Generative adversarial networks; Computer architecture; Generators; Network architecture; Search problems; Optimization; Evolutionary computation; generative adversarial networks (GANs); generative model; neural architecture search (NAS); GENETIC ALGORITHM; NEURAL-NETWORKS;
DOI
10.1109/TEVC.2023.3338371
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Generative adversarial networks (GANs) are a powerful generative technique but frequently suffer from unstable training. Network architecture plays a significant role in determining the final output of a GAN, yet designing a good architecture demands extensive domain expertise. This article addresses the issue by searching for high-performance generator architectures through neural architecture search (NAS). The proposed approach, evolutionary weight-sharing GANs (EWSGAN), is based on weight sharing and comprises two steps. First, a supernet of the generator is trained using weight sharing. Second, a multiobjective evolutionary algorithm (MOEA) is employed to identify optimal subnets of the supernet; these subnets inherit their weights directly from the supernet for fitness evaluation. Two strategies are used to stabilize the training of the generator supernet: 1) a fair single-path sampling strategy and 2) a discarding strategy. Experimental results indicate that the architecture found by our method sets a new state of the art among NAS-GAN methods, with a Fréchet inception distance (FID) of 9.09 and an inception score (IS) of 8.99 on the CIFAR-10 dataset. It also achieves competitive performance on the STL-10 dataset, with an FID of 21.89 and an IS of 10.51.
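The fair single-path sampling mentioned in the abstract can be illustrated with a short sketch. This is a hedged illustration based on FairNAS-style strict fairness, not the paper's exact procedure: at each training step, one permutation of candidate operations is drawn per choice block, so that across the sampled single-path subnets every candidate operation of every block is activated exactly once. The function name and parameters (`fair_single_path_batches`, `num_blocks`, `num_ops`) are hypothetical.

```python
import random

def fair_single_path_batches(num_blocks, num_ops):
    """Yield `num_ops` single-path subnets for one training step.

    A subnet is a list of operation indices, one per choice block.
    By drawing an independent permutation of the candidate ops for
    each block, every op of every block appears in exactly one of
    the yielded subnets (FairNAS-style strict fairness assumption).
    """
    perms = [random.sample(range(num_ops), num_ops) for _ in range(num_blocks)]
    for i in range(num_ops):
        yield [perms[b][i] for b in range(num_blocks)]

# One training step over a toy supernet with 5 choice blocks and
# 4 candidate ops per block: 4 paths, each block's ops covered once.
paths = list(fair_single_path_batches(num_blocks=5, num_ops=4))
```

In a weight-sharing setup, each yielded path would be trained for one mini-batch before the next is sampled, giving every shared weight an equal number of updates per step.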
Pages: 653-667
Page count: 15