Knowledge aggregation networks for class incremental learning

Cited by: 20
Authors
Fu, Zhiling [1 ,2 ]
Wang, Zhe [1 ,2 ]
Xu, Xinlei [1 ,2 ]
Li, Dongdong [2 ]
Yang, Hai [2 ]
Affiliations
[1] East China Univ Sci & Technol, Key Lab Smart Mfg Energy Chem Proc, Minist Educ, Shanghai 200237, Peoples R China
[2] East China Univ Sci & Technol, Dept Comp Sci & Engn, Shanghai 200237, Peoples R China
Keywords
Class incremental learning; Catastrophic forgetting; Dual-branch network; Knowledge aggregation; Model compression; CONSOLIDATION;
DOI
10.1016/j.patcog.2023.109310
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Most existing class incremental learning methods rely on storing old exemplars to avoid catastrophic forgetting. However, these methods inevitably face the gradient conflict problem, i.e., the inherent conflict in gradient direction between newly arriving knowledge and existing knowledge. To alleviate gradient conflict, this paper reuses previous knowledge and expands a new branch to accommodate new concepts instead of fine-tuning the original model. Specifically, this paper designs a novel dual-branch network called Knowledge Aggregation Networks. The previously trained model is frozen as one branch to retain existing knowledge, and a trainable network with the same structure is constructed as the other branch to learn new concepts. An adaptive feature fusion module dynamically balances the information from the two branches during training. Moreover, a model compression stage maintains the dual-branch structure across incremental stages. Extensive experiments on CIFAR-100, ImageNet-Sub, and ImageNet show that our method significantly outperforms other methods and effectively balances stability and plasticity. © 2023 Elsevier Ltd. All rights reserved.
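To make the dual-branch idea in the abstract concrete, the following minimal PyTorch sketch freezes a previously trained backbone as one branch, clones it as a structurally identical trainable branch, and combines the two feature streams with a small learned gate. It assumes the backbone returns a flat feature vector of size feat_dim (e.g., a ResNet with its classification head removed); the class names, gating layer, and fusion rule are hypothetical stand-ins for illustration, not the paper's exact modules, and the model compression stage mentioned in the abstract is not shown.

    # Hypothetical sketch of a dual-branch feature extractor with adaptive fusion.
    import copy
    import torch
    import torch.nn as nn

    class DualBranchNet(nn.Module):
        def __init__(self, old_backbone: nn.Module, feat_dim: int, num_classes: int):
            super().__init__()
            # Frozen branch: the previously trained backbone retains existing knowledge.
            self.old_branch = old_backbone
            for p in self.old_branch.parameters():
                p.requires_grad = False
            # Trainable branch: an identically structured copy learns new concepts.
            self.new_branch = copy.deepcopy(old_backbone)
            for p in self.new_branch.parameters():
                p.requires_grad = True
            # Adaptive fusion: a small gate produces per-sample weights for the two streams.
            self.gate = nn.Sequential(nn.Linear(2 * feat_dim, 2), nn.Softmax(dim=-1))
            self.classifier = nn.Linear(feat_dim, num_classes)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            with torch.no_grad():
                f_old = self.old_branch(x)                     # features from the frozen branch
            f_new = self.new_branch(x)                         # features from the trainable branch
            w = self.gate(torch.cat([f_old, f_new], dim=-1))   # per-sample fusion weights, shape (B, 2)
            fused = w[:, :1] * f_old + w[:, 1:] * f_new        # weighted combination of both streams
            return self.classifier(fused)

Because gradients flow only through the trainable branch and the gate, new concepts are learned without overwriting the frozen branch's parameters, which is the mechanism the abstract credits with easing gradient conflict.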
Pages: 12
Related Papers
50 records in total
  • [1] Rebalancing network with knowledge stability for class incremental learning
    Song, Jialun
    Chen, Jian
    Du, Lan
    PATTERN RECOGNITION, 2024, 153
  • [2] Adaptive knowledge transfer for class incremental learning
    Feng, Zhikun
    Zhou, Mian
    Gao, Zan
    Stefanidis, Angelos
    Su, Jionglong
    Dang, Kang
    Li, Chuanhui
    PATTERN RECOGNITION LETTERS, 2024, 183 : 165 - 171
  • [3] Semantic Knowledge Guided Class-Incremental Learning
    Wang, Shaokun
    Shi, Weiwei
    Dong, Songlin
    Gao, Xinyuan
    Song, Xiang
    Gong, Yihong
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2023, 33 (10) : 5921 - 5931
  • [4] Squeezing More Past Knowledge for Online Class-Incremental Continual Learning
    Yu, Da
    Zhang, Mingyi
    Li, Mantian
    Zha, Fusheng
    Zhang, Junge
    Sun, Lining
    Huang, Kaiqi
    IEEE-CAA JOURNAL OF AUTOMATICA SINICA, 2023, 10 (03) : 722 - 736
  • [6] Hyper-feature aggregation and relaxed distillation for class incremental learning
    Wu, Ran
    Liu, Huanyu
    Yue, Zongcheng
    Li, Jun-Bao
    Sham, Chiu-Wing
    PATTERN RECOGNITION, 2024, 152
  • [8] Connection-Based Knowledge Transfer for Class Incremental Learning
    Zhao, Guangzhi
    Mu, Kedian
    2023 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, IJCNN, 2023,
  • [9] DSS: A Diverse Sample Selection Method to Preserve Knowledge in Class-Incremental Learning
    Nokhwal, Sahil
    Kumar, Nirman
    2023 10TH INTERNATIONAL CONFERENCE ON SOFT COMPUTING & MACHINE INTELLIGENCE, ISCMI, 2023, : 178 - 182
  • [10] Class-Incremental Learning: A Survey
    Zhou, Da-Wei
    Wang, Qi-Wei
    Qi, Zhi-Hong
    Ye, Han-Jia
    Zhan, De-Chuan
    Liu, Ziwei
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2024, 46 (12) : 9851 - 9873