eVAE: Evolutionary Variational Autoencoder

Cited by: 3
Authors
Wu, Zhangkai [1 ]
Cao, Longbing [2 ]
Qi, Lei [3 ]
Affiliations
[1] Univ Technol Sydney, Fac Engn & Informat Technol, Broadway, NSW 2008, Australia
[2] Macquarie Univ, Sch Comp, Data Sci Lab, DataX Res Ctr, Sydney, NSW 2109, Australia
[3] Southeast Univ, Comp Sci & Engn, Nanjing 210096, Peoples R China
Funding
Australian Research Council;
Keywords
Task analysis; Training; Fitting; Convergence; Tuning; Probabilistic logic; Deep learning; Evolutionary variational autoencoder (eVAE); variational autoencoder (VAE); variational genetic algorithm (VGA);
DOI
10.1109/TNNLS.2024.3359275
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Variational autoencoders (VAEs) are challenged by the imbalance between representation inference and task fitting caused by surrogate loss. To address this issue, existing methods adjust their balance by directly tuning their coefficients. However, these methods suffer from a tradeoff uncertainty, i.e., nondynamic regulation over iterations and inflexible hyperparameters for learning tasks. Accordingly, we make the first attempt to introduce an evolutionary VAE (eVAE), building on the variational information bottleneck (VIB) theory and integrative evolutionary neural learning. eVAE integrates a variational genetic algorithm (VGA) into VAE with variational evolutionary operators, including variational mutation (V-mutation), crossover, and evolution. Its training mechanism synergistically and dynamically addresses and updates the learning tradeoff uncertainty in the evidence lower bound (ELBO) without additional constraints and hyperparameter tuning. Furthermore, eVAE presents an evolutionary paradigm to tune critical factors of VAEs and addresses the premature convergence and random search problem in integrating evolutionary optimization into deep learning. Experiments show that eVAE addresses the KL-vanishing problem for text generation with low reconstruction loss, generates all the disentangled factors with sharp images, and improves image generation quality. eVAE achieves better disentanglement, generation performance, and generation-inference balance than its competitors. Code available at: https://github.com/amasawa/eVAE.
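The abstract describes integrating a genetic algorithm into VAE training so that the ELBO tradeoff is adjusted dynamically rather than fixed by hand. As an illustration only, the minimal sketch below evolves a population of candidate KL-weight coefficients (beta) with blend crossover and Gaussian mutation; the `elbo_fitness` function is a hypothetical toy stand-in for evaluating a trained VAE's ELBO, and none of this reflects the authors' actual implementation (see the linked GitHub repository for that).

```python
import random

def elbo_fitness(beta, target=1.0):
    # Hypothetical stand-in for the ELBO score of a VAE trained with
    # KL weight `beta`; a real run would train and evaluate a model.
    # Toy objective: fitness peaks when beta is near an assumed optimum.
    return -(beta - target) ** 2

def evolve_beta(pop_size=8, generations=20, seed=0):
    """Minimal genetic-algorithm sketch for tuning the ELBO tradeoff
    coefficient, loosely mirroring the mutation/crossover idea."""
    rng = random.Random(seed)
    population = [rng.uniform(0.0, 4.0) for _ in range(pop_size)]
    for _ in range(generations):
        # Selection: keep the fitter half of the population as parents.
        population.sort(key=elbo_fitness, reverse=True)
        parents = population[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            child = 0.5 * (a + b)             # crossover: blend two parents
            child += rng.gauss(0.0, 0.1)      # mutation: small Gaussian noise
            children.append(max(child, 0.0))  # keep beta non-negative
        population = parents + children
    return max(population, key=elbo_fitness)

best = evolve_beta()
```

In the paper's setting the fitness would come from the model itself, so the coefficient is re-tuned as training progresses instead of being fixed before training, which is the "dynamic regulation" the abstract contrasts with direct coefficient tuning.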
Pages: 3288-3299
Page count: 12