Latent Coreset Sampling based Data-Free Continual Learning

Cited by: 3
Authors
Wang, Zhuoyi [1]
Li, Dingcheng [1]
Li, Ping [1]
Affiliations
[1] Baidu Research, Cognitive Computing Lab, Bellevue, WA 98004 USA
Source
PROCEEDINGS OF THE 31ST ACM INTERNATIONAL CONFERENCE ON INFORMATION AND KNOWLEDGE MANAGEMENT (CIKM 2022), 2022
Keywords
Continual Learning; Data-free; Coreset Sampling; Latent representation
DOI
10.1145/3511808.3557375
CLC number
TP [Automation Technology, Computer Technology]
Subject classification code
0812
Abstract
Catastrophic forgetting is a major challenge in continual learning: old knowledge is forgotten when the model is updated on new tasks. Existing solutions typically address this challenge through generative models or exemplar-replay strategies. However, such methods may generate or select low-quality samples for replay, which directly reduces the effectiveness of the model, especially under class imbalance, noise, or redundancy. Accordingly, selecting a suitable coreset during continual learning becomes significant in such settings. In this work, we propose a novel approach, continual coreset sampling (CCS), to address these challenges. We aim to select the most representative subset at each iteration: when the model is trained on new tasks, the selected subset closely approximates/matches the gradient of both the previous and current tasks with respect to the model parameters, so that adaptation of the model to new datasets can be more efficient. Furthermore, instead of storing old data to maintain old knowledge, our approach preserves it in the latent space: we augment the previous classes in the embedding space as pseudo sample vectors from the old encoder output, strengthened by joint training with the selected new data. This avoids data privacy invasions in real-world applications when the model is updated. Our experiments validate the effectiveness of the proposed approach on various CV/NLP datasets against current baselines, and show clear improvements in model adaptation and forgetting reduction in a data-free manner.
Pages: 2078-2087
Number of pages: 10
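
Editor's note: the abstract describes two mechanisms, (1) selecting a coreset whose gradient approximates the gradients of previous and current tasks, and (2) replaying old classes as perturbed embedding vectors from the frozen old encoder rather than storing raw data. The sketch below is one possible reading of those two ideas in PyTorch, not the authors' implementation; the names model, model.head, old_encoder, the greedy selection rule, and noise_std are all illustrative assumptions.

# Minimal sketch, assuming a classifier `model` whose final linear layer is
# exposed as `model.head`, and a frozen copy of the previous task's encoder
# `old_encoder`. All names and hyperparameters are hypothetical.
import torch
import torch.nn.functional as F


def last_layer_grads(model, xs, ys):
    """Per-sample gradient of the loss w.r.t. the final linear layer's weights."""
    grads = []
    for x, y in zip(xs, ys):
        loss = F.cross_entropy(model(x.unsqueeze(0)), y.unsqueeze(0))
        (g,) = torch.autograd.grad(loss, model.head.weight)
        grads.append(g.flatten())
    return torch.stack(grads)  # shape (N, D)


def select_coreset(model, xs, ys, budget):
    """Greedily pick a subset whose mean gradient approximates the
    full-data gradient, so training on it mimics training on all data."""
    G = last_layer_grads(model, xs, ys)
    target = G.mean(dim=0)                 # gradient of the whole task
    chosen, remaining = [], list(range(len(xs)))
    residual = target.clone()
    for _ in range(min(budget, len(xs))):
        scores = G[remaining] @ residual   # alignment with the unmatched part
        best = remaining[int(scores.argmax())]
        chosen.append(best)
        remaining.remove(best)
        residual = target - G[chosen].mean(dim=0)
    return chosen


def latent_pseudo_samples(old_encoder, coreset_x, noise_std=0.05):
    """Data-free replay: keep only embeddings from the frozen old encoder
    and perturb them, instead of storing the raw old examples."""
    with torch.no_grad():
        z = old_encoder(coreset_x)
    return z + noise_std * torch.randn_like(z)

As the abstract describes, a new task would then be learned by jointly training on the selected coreset of new data and the pseudo sample vectors in the embedding space, which is what allows the method to avoid retaining raw samples from earlier tasks.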