Adapt & Align: Continual Learning with Generative Models' Latent Space Alignment

Cited: 0
Authors
Deja, Kamil [1 ,2 ]
Cywinski, Bartosz [1 ,2 ]
Rybarczyk, Jan [1 ]
Trzcinski, Tomasz [1 ,2 ,3 ]
Affiliations
[1] Warsaw Univ Technol, pl Politech 1, PL-00661 Warsaw, Poland
[2] Ideas Res Inst, ul Krolewska 27, PL-00060 Warsaw, Poland
[3] Tooploox, Teczowa 7, PL-53601 Wroclaw, Poland
Keywords
Continual learning; Generative modeling; VAE; GAN
DOI
10.1016/j.neucom.2025.130748
CLC Classification Number
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Motivation: Neural networks suffer an abrupt loss in performance when retrained with additional data from different distributions. At the same time, training with additional data without access to the previous examples rarely improves the model's performance. Methods: We propose Adapt & Align, a novel continual learning framework that leverages generative models to align their latent representations across tasks. The approach is divided into two phases. Local Training: Train a generative model (e.g., a Variational Autoencoder (VAE) or a Generative Adversarial Network (GAN)) on the current task to capture task-specific features. Global Training: Use a translator network to map these task-specific latent representations into a unified global latent space, thereby facilitating both forward and backward knowledge transfer. Results: Experiments on benchmark datasets (e.g., MNIST, Omniglot, CIFAR, CelebA) as well as a real-world application for particle simulation at CERN demonstrate that Adapt & Align mitigates catastrophic forgetting and improves generation quality, as indicated by metrics such as Fréchet Inception Distance (FID), distribution precision and recall, and accuracy on the downstream classification task. Ablation studies confirm the critical role of each component.
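The two-phase idea in the abstract (per-task local training, then a translator mapping each task's latent codes into one shared global space) can be illustrated with a deliberately simplified toy sketch. This is not the authors' implementation: the random "latent codes", the linear least-squares translators, and the Gaussian global targets are all stand-ins chosen for brevity, where the paper uses trained VAE/GAN encoders and a learned translator network.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for task-specific latent codes produced in phase 1
# ("local training"). In the paper these would come from a per-task
# VAE or GAN; here they are random features with shifted statistics.
d_local, d_global, n = 8, 4, 200
Z_task1 = rng.normal(size=(n, d_local))
Z_task2 = rng.normal(size=(n, d_local)) + 3.0  # different distribution

# Hypothetical shared global targets: in phase 2 ("global training")
# a translator maps every task's codes into one unified latent space.
G = rng.normal(size=(n, d_global))

def fit_translator(Z, G):
    """Least-squares linear translator W such that Z @ W ~= G."""
    W, *_ = np.linalg.lstsq(Z, G, rcond=None)
    return W

W1 = fit_translator(Z_task1, G)
W2 = fit_translator(Z_task2, G)

# After translation, both tasks' codes live in the same global space,
# so their statistics align even though the local codes differ.
mu1 = (Z_task1 @ W1).mean(axis=0)
mu2 = (Z_task2 @ W2).mean(axis=0)
print("max mean gap after alignment:", float(np.abs(mu1 - mu2).max()))
```

The design point the sketch mirrors is that only the small translators are task-specific, while the global space is shared, which is what allows knowledge to transfer across tasks without revisiting old data.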
Pages: 17