Latent Coreset Sampling based Data-Free Continual Learning

Cited by: 3
Authors
Wang, Zhuoyi [1 ]
Li, Dingcheng [1 ]
Li, Ping [1 ]
Affiliations
[1] Baidu Research, Cognitive Computing Lab, Bellevue, WA 98004, USA
Source
PROCEEDINGS OF THE 31ST ACM INTERNATIONAL CONFERENCE ON INFORMATION AND KNOWLEDGE MANAGEMENT, CIKM 2022 | 2022
Keywords
Continual Learning; Data-free; Coreset Sampling; Latent representation;
DOI
10.1145/3511808.3557375
Chinese Library Classification
TP [automation technology, computer technology];
Discipline Classification Code
0812;
Abstract
Catastrophic forgetting poses a major challenge in continual learning: old knowledge is forgotten when the model is updated on new tasks. Existing solutions tend to address this challenge through generative models or exemplar-replay strategies. However, such methods may not alleviate the problem of low-quality samples being generated or selected for replay, which directly reduces the effectiveness of the model, especially in scenarios with class imbalance, noise, or redundancy. Accordingly, selecting a suitable coreset during continual learning becomes significant in such settings. In this work, we propose a novel approach that leverages continual coreset sampling (CCS) to address these challenges. We aim to select the most representative subset during each iteration: when the model is trained on new tasks, the subset closely approximates/matches the gradient of both the previous and current tasks with respect to the model parameters, so that the model adapts to new datasets more efficiently. Furthermore, instead of storing old data to maintain old knowledge, our approach preserves it in the latent space. We augment the previous classes in the embedding space as pseudo sample vectors drawn from the old encoder output, strengthened by joint training with the selected new data. This avoids data privacy invasions in real-world applications when the model is updated. Our experiments validate the effectiveness of the proposed approach on various CV/NLP datasets against current baselines, and also show clear improvements in model adaptation and forgetting reduction in a data-free manner.
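The abstract describes two mechanisms: gradient-matching coreset selection and latent-space pseudo-replay. Below is a minimal, hypothetical PyTorch sketch of both ideas, assuming a greedy cosine-similarity matching heuristic; all names (flat_grad, select_coreset, augment_latents) and the jitter-based embedding augmentation are illustrative assumptions, not the paper's released implementation.

```python
# Hypothetical sketch of (1) gradient-matching coreset selection and
# (2) latent-space pseudo-replay, as described in the abstract.
import torch
import torch.nn as nn
import torch.nn.functional as F


def flat_grad(loss, params):
    # Flatten the gradient of a scalar loss w.r.t. `params` into one vector.
    grads = torch.autograd.grad(loss, params)
    return torch.cat([g.reshape(-1) for g in grads])


def select_coreset(model, xs, ys, k):
    # Greedily pick k samples whose accumulated per-sample gradients best
    # match (in cosine similarity) the full-batch gradient direction.
    params = [p for p in model.parameters() if p.requires_grad]
    target = flat_grad(F.cross_entropy(model(xs), ys), params)
    per_sample = [
        flat_grad(F.cross_entropy(model(xs[i:i + 1]), ys[i:i + 1]), params)
        for i in range(len(xs))
    ]
    chosen, running = [], torch.zeros_like(target)
    for _ in range(k):
        best_i, best_sim = -1, -float("inf")
        for i in range(len(xs)):
            if i in chosen:
                continue
            sim = F.cosine_similarity(running + per_sample[i], target, dim=0).item()
            if sim > best_sim:
                best_i, best_sim = i, sim
        chosen.append(best_i)
        running = running + per_sample[best_i]
    return chosen


def augment_latents(stored_latents, noise_std=0.05):
    # Data-free replay: only encoder outputs are kept, never raw samples;
    # Gaussian jitter synthesizes pseudo sample vectors for old classes.
    return stored_latents + noise_std * torch.randn_like(stored_latents)


# Toy usage on random data (shapes and the 2-layer classifier are assumed).
model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 4))
xs, ys = torch.randn(64, 16), torch.randint(0, 4, (64,))
core_idx = select_coreset(model, xs, ys, k=8)
encoder = nn.Sequential(*list(model.children())[:-1])  # trunk as "old encoder"
with torch.no_grad():
    stored = encoder(xs[core_idx])   # latent vectors replace raw data
replay_z = augment_latents(stored)   # pseudo vectors for joint training
```

The greedy pass above costs one per-sample gradient per candidate; practical variants typically restrict matching to last-layer gradients to keep selection cheap.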
Pages: 2078-2087
Page count: 10
Related Papers
50 records in total
  • [11] R-DFCIL: Relation-Guided Representation Learning for Data-Free Class Incremental Learning
    Gao, Qiankun
    Zhao, Chen
    Ghanem, Bernard
    Zhang, Jian
COMPUTER VISION - ECCV 2022, PT XXIII, 2022, 13683 : 423 - 439
  • [12] Data-Free Solution of Electromagnetic PDEs Using Neural Networks and Extension to Transfer Learning
    Bhardwaj, Shubhendu
    Gaire, Pawan
    IEEE TRANSACTIONS ON ANTENNAS AND PROPAGATION, 2022, 70 (07) : 5179 - 5188
  • [13] Beyond Prompt Learning: Continual Adapter for Efficient Rehearsal-Free Continual Learning
    Gao, Xinyuan
    Dong, Songlin
    He, Yuhang
    Wang, Qiang
    Gong, Yihong
    COMPUTER VISION - ECCV 2024, PT LXXXV, 2025, 15143 : 89 - 106
  • [14] A physics-informed neural network enhanced importance sampling (PINN-IS) for data-free reliability analysis
    Roy, Atin
    Chatterjee, Tanmoy
    Adhikari, Sondipon
    PROBABILISTIC ENGINEERING MECHANICS, 2024, 78
  • [15] A Data-Free Approach for Targeted Universal Adversarial Perturbation
    Wang, Xiaoyu
    Bai, Tao
    Zhao, Jun
    SCIENCE OF CYBER SECURITY, SCISEC 2021, 2021, 13005 : 126 - 138
  • [16] Continual learning with Bayesian compression for shared and private latent representations
    Yang, Yang
    Guo, Dandan
    Chen, Bo
    Hu, Dexiu
    NEURAL NETWORKS, 2025, 185
  • [17] FedINC: An Exemplar-Free Continual Federated Learning Framework with Small Labeled Data
    Deng, Yongheng
    Yue, Sheng
    Wang, Tuowei
    Wang, Guanbo
    Ren, Ju
    Zhang, Yaoxue
    PROCEEDINGS OF THE 21ST ACM CONFERENCE ON EMBEDDED NETWORKED SENSOR SYSTEMS, SENSYS 2023, 2023, : 56 - 69
  • [18] EXEMPLAR-FREE ONLINE CONTINUAL LEARNING
    He, Jiangpeng
    Zhu, Fengqing
    2022 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING, ICIP, 2022, : 541 - 545
  • [19] SpaceNet: Make Free Space for Continual Learning
    Sokar, Ghada
    Mocanu, Decebal Constantin
    Pechenizkiy, Mykola
    NEUROCOMPUTING, 2021, 439 : 1 - 11
  • [20] Principal Gradient Direction and Confidence Reservoir Sampling for Continual Learning
    Chen, Zhiyi
    Lin, Tong
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2021, PT II, 2021, 12892 : 421 - 432