Knowledge aggregation networks for class incremental learning

Cited by: 22
Authors
Fu, Zhiling [1 ,2 ]
Wang, Zhe [1 ,2 ]
Xu, Xinlei [1 ,2 ]
Li, Dongdong [2 ]
Yang, Hai [2 ]
Affiliations
[1] East China Univ Sci & Technol, Key Lab Smart Mfg Energy Chem Proc, Minist Educ, Shanghai 200237, Peoples R China
[2] East China Univ Sci & Technol, Dept Comp Sci & Engn, Shanghai 200237, Peoples R China
Keywords
Class incremental learning; Catastrophic forgetting; Dual-branch network; Knowledge aggregation; Model compression; CONSOLIDATION;
DOI
10.1016/j.patcog.2023.109310
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Most existing class incremental learning methods rely on storing old exemplars to avoid catastrophic forgetting. However, these methods inevitably face the gradient conflict problem: the inherent conflict, in gradient direction, between new streaming knowledge and existing knowledge. To alleviate gradient conflict, this paper reuses previous knowledge and expands a branch to accommodate new concepts instead of fine-tuning the original model. Specifically, this paper designs a novel dual-branch network called Knowledge Aggregation Networks. The previously trained model is frozen as one branch to retain existing knowledge, and a trainable network of identical architecture is constructed as the other branch to learn new concepts. An adaptive feature fusion module dynamically balances the two branches' information during training. Moreover, a model compression stage maintains the dual-branch structure. Extensive experiments on CIFAR-100, ImageNet-Sub, and ImageNet show that our method significantly outperforms the other methods and effectively balances stability and plasticity. © 2023 Elsevier Ltd. All rights reserved.
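The dual-branch design described in the abstract can be sketched roughly as follows. This is a minimal NumPy illustration under assumed details (linear-plus-tanh stand-ins for the branches, a per-dimension sigmoid gate as the adaptive fusion module), not the paper's actual architecture: the real branches are deep networks and the fusion module's exact form is defined in the paper.

```python
import numpy as np

# Minimal sketch of the dual-branch idea: a frozen "old" branch, a trainable
# "new" branch with the same architecture, and an adaptive per-dimension gate
# that fuses their features.
rng = np.random.default_rng(0)
d_in, d_feat = 8, 4

W_old = rng.normal(size=(d_in, d_feat))  # frozen branch: retains existing knowledge
W_new = rng.normal(size=(d_in, d_feat))  # trainable branch: learns new concepts
gate = np.zeros(d_feat)                  # trainable fusion parameters (hypothetical form)

def fused_features(x):
    f_old = np.tanh(x @ W_old)           # features from the frozen branch
    f_new = np.tanh(x @ W_new)           # features from the trainable branch
    alpha = 1.0 / (1.0 + np.exp(-gate))  # sigmoid gate in (0, 1) per feature dim
    return alpha * f_new + (1.0 - alpha) * f_old

x = rng.normal(size=(2, d_in))           # a batch of two inputs
f = fused_features(x)
print(f.shape)                            # (2, 4)
```

During incremental training, only `W_new`, `gate`, and the classifier head would receive gradients; with `gate` initialized to zero the two branches contribute equally, and training shifts the balance per feature dimension.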
Pages: 12
Related Papers
50 records
[41]   Class Incremental Learning With Less Forgetting Direction and Equilibrium Point [J].
Wen, Haitao ;
Qiu, Heqian ;
Wang, Lanxiao ;
Cheng, Haoyang ;
Li, Hongliang .
IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2025, 35 (02) :1150-1164
[42]   Class incremental learning via Multi-hinge distillation [J].
Lin, Qianhe ;
Yu, Yuanlong ;
Huang, Zhiyong .
2020 CHINESE AUTOMATION CONGRESS (CAC 2020), 2020, :5996-6000
[43]   A Class-Incremental Learning Method for PCB Defect Detection [J].
Ge, Quanbo ;
Wu, Ruilin ;
Wu, Yupei ;
Liu, Huaping .
IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 2025, 74
[44]   Multi-granularity knowledge distillation and prototype consistency regularization for class-incremental learning [J].
Shi, Yanyan ;
Shi, Dianxi ;
Qiao, Ziteng ;
Wang, Zhen ;
Zhang, Yi ;
Yang, Shaowu ;
Qiu, Chunping .
NEURAL NETWORKS, 2023, 164 :617-630
[45]   Class and Data-Incremental Learning Framework for Baggage Threat Segmentation via Knowledge Distillation [J].
Nasim, Ammara ;
Khan, Saad Mazhar ;
Salam, Anum Abdul ;
Shaukat, Arslan ;
Hassan, Taimur ;
Syed, Adeel M. ;
Akram, Muhammad Usman .
IEEE ACCESS, 2025, 13 :95977-96000
[46]   Knowledge fusion distillation and gradient-based data distillation for class-incremental learning [J].
Xiong, Lin ;
Guan, Xin ;
Xiong, Hailing ;
Zhu, Kangwen ;
Zhang, Fuqing .
NEUROCOMPUTING, 2025, 622
[47]   Class incremental learning with self-supervised pre-training and prototype learning [J].
Liu, Wenzhuo ;
Wu, Xin-Jian ;
Zhu, Fei ;
Yu, Ming-Ming ;
Wang, Chuang ;
Liu, Cheng-Lin .
PATTERN RECOGNITION, 2025, 157
[48]   ISM-Net: Mining incremental semantics for class incremental learning [J].
Qiu, Zihuan ;
Xu, Linfeng ;
Wang, Zhichuan ;
Wu, Qingbo ;
Meng, Fanman ;
Li, Hongliang .
NEUROCOMPUTING, 2023, 523 :130-143
[49]   CLASS INCREMENTAL LEARNING FOR VIDEO ACTION CLASSIFICATION [J].
Ma, Jiawei ;
Tao, Xiaoyu ;
Ma, Jianxing ;
Hong, Xiaopeng ;
Gong, Yihong .
2021 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2021, :504-508
[50]   Class Incremental Learning: A Review and Performance Evaluation [J].
Zhu, F. ;
Zhang, X.-Y. ;
Liu, C.-L. .
Zidonghua Xuebao/Acta Automatica Sinica, 2023, 49 (03) :635-660