Latent Coreset Sampling based Data-Free Continual Learning

Cited by: 3
Authors:
Wang, Zhuoyi [1]
Li, Dingcheng [1]
Li, Ping [1]
Affiliations:
[1] Baidu Research, Cognitive Computing Lab, Bellevue, WA 98004, USA
Source:
Proceedings of the 31st ACM International Conference on Information and Knowledge Management (CIKM 2022), 2022
Keywords:
Continual Learning; Data-free; Coreset Sampling; Latent Representation
DOI:
10.1145/3511808.3557375
CLC Classification:
TP [Automation and Computer Technology]
Discipline Code:
0812
Abstract:
Catastrophic forgetting poses a major challenge in continual learning: knowledge of old tasks is lost when the model is updated on new ones. Existing solutions typically address this challenge through generative models or exemplar-replay strategies. However, such methods may not prevent low-quality samples from being generated or selected for replay, which directly reduces the effectiveness of the model, especially under class imbalance, noise, or redundancy. Accordingly, selecting a suitable coreset during continual learning becomes crucial in such settings. In this work, we propose a novel approach that leverages continual coreset sampling (CCS) to address these challenges. We aim to select the most representative subset at each iteration: when the model is trained on new tasks, the selected subset closely matches the gradient of both the previous and current tasks with respect to the model parameters, so that adaptation of the model to new datasets becomes more efficient. Furthermore, instead of storing old data to maintain old knowledge, our approach preserves it in the latent space: we augment the previous classes in the embedding space with pseudo sample vectors drawn from the old encoder's output, strengthened by joint training with the selected new data. This avoids data privacy invasions in real-world applications when the model is updated. Our experiments validate the effectiveness of the proposed approach on various CV/NLP datasets against current baselines, and demonstrate clear improvements in model adaptation and forgetting reduction in a data-free manner.
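The abstract describes two mechanisms: gradient-matched coreset selection and data-free replay of old classes via pseudo vectors in the latent space. Below is a minimal PyTorch sketch of both ideas, assuming a fixed encoder and a linear classifier head; it is not the authors' implementation, and the greedy matching heuristic, the per-class Gaussian latent statistics, and all function names (per_sample_gradients, select_coreset, pseudo_latents) are illustrative assumptions.

```python
# Illustrative sketch only -- NOT the authors' code. The greedy gradient
# matching and the Gaussian latent replay below are assumptions made for
# illustration of the ideas described in the abstract.
import torch
import torch.nn.functional as F


def per_sample_gradients(encoder, head, xs, ys):
    """Flattened gradient of the loss w.r.t. the classifier head,
    one row per sample, shape (N, P)."""
    rows = []
    for x, y in zip(xs, ys):
        z = encoder(x.unsqueeze(0))                        # latent vector
        loss = F.cross_entropy(head(z), y.unsqueeze(0))
        grads = torch.autograd.grad(loss, list(head.parameters()))
        rows.append(torch.cat([g.reshape(-1) for g in grads]))
    return torch.stack(rows)


def select_coreset(grads, k):
    """Greedily pick k samples whose mean gradient best matches the
    mean gradient of the full set (a gradient-matching objective)."""
    target = grads.mean(dim=0)
    chosen, running = [], torch.zeros_like(target)
    for _ in range(k):
        best_i, best_err = -1, float("inf")
        for i in range(grads.size(0)):
            if i in chosen:
                continue
            # error if sample i were added to the current selection
            cand = (running + grads[i]) / (len(chosen) + 1)
            err = torch.norm(cand - target).item()
            if err < best_err:
                best_i, best_err = i, err
        chosen.append(best_i)
        running = running + grads[best_i]
    return chosen


def pseudo_latents(class_means, class_stds, n_per_class):
    """Data-free replay: sample pseudo latent vectors for old classes
    from stored per-class statistics of the old encoder's outputs."""
    zs, ys = [], []
    for c, (mu, sd) in enumerate(zip(class_means, class_stds)):
        noise = torch.randn(n_per_class, mu.numel())
        zs.append(mu + noise * sd)                         # augmented vectors
        ys.append(torch.full((n_per_class,), c, dtype=torch.long))
    return torch.cat(zs), torch.cat(ys)
```

In a CCS-style loop, one would select a coreset of the current task's data with select_coreset, draw pseudo_latents for the old classes, and train the classifier head jointly on both, so no raw old data is ever stored.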
Pages: 2078-2087
Number of Pages: 10
Related Papers (showing 10 of 50):
  • [1] Memory efficient data-free distillation for continual learning
    Li, Xiaorong
    Wang, Shipeng
    Sun, Jian
    Xu, Zongben
    PATTERN RECOGNITION, 2023, 144
  • [2] Variational Data-Free Knowledge Distillation for Continual Learning
    Li, Xiaorong
    Wang, Shipeng
    Sun, Jian
    Xu, Zongben
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2023, 45 (10) : 12618 - 12634
  • [3] A novel data-free continual learning method with contrastive reversion
    Wu, Chu
    Xie, Runshan
    Wang, Shitong
    INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2024, 15 (02) : 505 - 518
  • [4] Communication Efficient Coreset Sampling for Distributed Learning
    Fan, Yawen
    Li, Husheng
    2018 IEEE 19TH INTERNATIONAL WORKSHOP ON SIGNAL PROCESSING ADVANCES IN WIRELESS COMMUNICATIONS (SPAWC), 2018, : 76 - 80
  • [5] Latent spectral regularization for continual learning
    Frascaroli, Emanuele
    Benaglia, Riccardo
    Boschini, Matteo
    Moschella, Luca
    Fiorini, Cosimo
    Rodola, Emanuele
    Calderara, Simone
    PATTERN RECOGNITION LETTERS, 2024, 184 : 119 - 125
  • [6] An Energy Sampling Replay-Based Continual Learning Framework
    Zhang, Xingzhong
    Chuah, Joon Huang
    Loo, Chu Kiong
    Wermter, Stefan
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING-ICANN 2024, PT II, 2024, 15017 : 17 - 30
  • [7] Projected Latent Distillation for Data-Agnostic Consolidation in distributed continual learning
    Carta, Antonio
    Cossu, Andrea
    Lomonaco, Vincenzo
    Bacciu, Davide
    van de Weijer, Joost
    NEUROCOMPUTING, 2024, 598
  • [8] Information Bottleneck Based Data Correction in Continual Learning
    Chen, Shuai
    Zhang, Mingyi
    Zhang, Junge
    Huang, Kaiqi
    COMPUTER VISION - ECCV 2024, PT LXXXVII, 2025, 15145 : 265 - 281
  • [9] Exploring and Exploiting Data-Free Model Stealing
    Hong, Chi
    Huang, Jiyue
    Birke, Robert
    Chen, Lydia Y.
    MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES: RESEARCH TRACK, ECML PKDD 2023, PT V, 2023, 14173 : 20 - 35