Connection-Based Knowledge Transfer for Class Incremental Learning

Cited: 0
Authors
Zhao, Guangzhi [1 ]
Mu, Kedian [1 ]
Affiliations
[1] Peking Univ, Sch Math Sci, Beijing, Peoples R China
Source
2023 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, IJCNN | 2023
Funding
National Natural Science Foundation of China;
Keywords
class incremental learning; one class classification; contrastive learning;
DOI
10.1109/IJCNN54540.2023.10191696
CLC Number (Chinese Library Classification)
TP18 [Theory of Artificial Intelligence];
Discipline Classification Code
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
We consider the problem of class incremental learning (CIL), where an agent aims to learn new classes continually without forgetting previous ones. As one of the mainstream paradigms of incremental learning, parameter isolation methods prevent forgetting by allocating different model parameters to each task, but knowledge transfer across tasks is difficult and usually overlooked. As a consequence, the discriminability between old and new classes is limited, especially when training data of old classes is not accessible. In this paper, we propose a new data-free approach named Twin Contrastive Networks (TCN) for CIL that utilizes the connections among tasks and network parameters. Specifically, we treat CIL as a sequence of one-class classification tasks and train a separate classifier to identify each class. To facilitate knowledge transfer and make full use of accumulated knowledge, a twin network structure is adopted to learn different feature representations for future use. When encountering new classes, previous twin networks are used directly through a contrastive loss to improve the model's discriminability. TCN avoids catastrophic forgetting by fixing all learned parameters and leverages the prior knowledge contained in these networks. Experiments on three widely used incremental learning benchmarks verify the effectiveness of TCN.
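The abstract describes TCN at a high level: each class gets its own twin network whose parameters are frozen after training, and a contrastive loss pushes the representation of a newly arriving class away from the outputs of the earlier, frozen networks. The PyTorch snippet below is a minimal sketch of that idea under stated assumptions; the record does not include code, so the class and function names (TwinNetwork, contrastive_loss), the MLP architecture, and the exact InfoNCE-style form of the loss are illustrative choices, not the authors' implementation.

```python
# Minimal sketch of the Twin Contrastive Networks (TCN) idea from the abstract.
# All names, network sizes, and the exact loss form are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TwinNetwork(nn.Module):
    """A pair of feature extractors trained for a single class (assumed structure)."""

    def __init__(self, in_dim: int = 32 * 32 * 3, feat_dim: int = 128):
        super().__init__()
        self.branch_a = nn.Sequential(nn.Linear(in_dim, 256), nn.ReLU(), nn.Linear(256, feat_dim))
        self.branch_b = nn.Sequential(nn.Linear(in_dim, 256), nn.ReLU(), nn.Linear(256, feat_dim))

    def forward(self, x: torch.Tensor):
        x = x.flatten(1)
        # L2-normalized embeddings from the two branches of the twin network
        return F.normalize(self.branch_a(x), dim=1), F.normalize(self.branch_b(x), dim=1)


def contrastive_loss(new_feats, old_feats_list, temperature: float = 0.1):
    """Pull the two branches of the new class together while pushing them away
    from representations produced by the frozen networks of earlier classes."""
    za, zb = new_feats
    pos = (za * zb).sum(dim=1) / temperature                       # positive pair similarity
    negs = [(za * z_old).sum(dim=1) / temperature for z_old in old_feats_list]
    logits = torch.stack([pos] + negs, dim=1)                      # shape [B, 1 + num_old_classes]
    targets = torch.zeros(za.size(0), dtype=torch.long)            # positive sits at index 0
    return F.cross_entropy(logits, targets)


# Usage sketch: when a new class arrives, earlier twin networks stay frozen and
# only provide negatives; the new network is the only module being trained.
old_networks = [TwinNetwork().eval() for _ in range(2)]            # stand-ins for previously trained classes
for net in old_networks:
    for p in net.parameters():
        p.requires_grad_(False)

new_network = TwinNetwork()
optimizer = torch.optim.Adam(new_network.parameters(), lr=1e-3)

x = torch.randn(16, 3, 32, 32)                                     # a batch from the new class
za, zb = new_network(x)
with torch.no_grad():
    old_feats = [net(x)[0] for net in old_networks]                # frozen features used as negatives

loss = contrastive_loss((za, zb), old_feats)
loss.backward()
optimizer.step()
```

Freezing the earlier networks mirrors the parameter-isolation guarantee against forgetting described in the abstract, while the contrastive term is where the cross-task knowledge transfer takes place.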
Pages: 8
Related Papers
50 records in total
  • [21] DSS: A Diverse Sample Selection Method to Preserve Knowledge in Class-Incremental Learning
    Nokhwal, Sahil
    Kumar, Nirman
    2023 10TH INTERNATIONAL CONFERENCE ON SOFT COMPUTING & MACHINE INTELLIGENCE, ISCMI, 2023, : 178 - 182
  • [22] Knowledge Representation by Generic Models for Few-Shot Class-Incremental Learning
    Chen, Xiaodong
    Jiang, Weijie
    Huang, Zhiyong
    Su, Jiangwen
    Yu, Yuanlong
    ADVANCES IN NATURAL COMPUTATION, FUZZY SYSTEMS AND KNOWLEDGE DISCOVERY, ICNC-FSKD 2022, 2023, 153 : 1237 - 1247
  • [23] Self-distilled Knowledge Delegator for Exemplar-free Class Incremental Learning
    Ye, Fanfan
    Ma, Liang
    Zhong, Qiaoyong
    Xie, Di
    Pu, Shiliang
    2022 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2022,
  • [24] Class-Incremental Learning: A Survey
    Zhou, Da-Wei
    Wang, Qi-Wei
    Qi, Zhi-Hong
    Ye, Han-Jia
    Zhan, De-Chuan
    Liu, Ziwei
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2024, 46 (12) : 9851 - 9873
  • [25] Rethinking class orders and transferability in class incremental learning
    He, Chen
    Wang, Ruiping
    Chen, Xilin
    PATTERN RECOGNITION LETTERS, 2022, 161 : 67 - 73
  • [26] Online Class-Incremental Learning in Image Classification Based on Attention
    Du, Baoyu
    Wei, Zhonghe
    Cheng, Jinyong
    Lv, Guohua
    Dai, Xiaoyu
    PATTERN RECOGNITION AND COMPUTER VISION, PRCV 2023, PT VII, 2024, 14431 : 487 - 499
  • [27] Class Incremental Learning Based on Playback Images Generated by Classification Network
    Zhu, Qiuyu
    Zhang, Yunxiao
    Zhuo, Yunhang
    Chen, Junli
    INTERNATIONAL JOURNAL OF PATTERN RECOGNITION AND ARTIFICIAL INTELLIGENCE, 2025,
  • [28] Deep Learning for Class-Incremental Learning: A Survey
    Zhou D.-W.
    Wang F.-Y.
    Ye H.-J.
    Zhan D.-C.
    Jisuanji Xuebao/Chinese Journal of Computers, 2023, 46 (08): : 1577 - 1605
  • [29] Multi-granularity knowledge distillation and prototype consistency regularization for class-incremental learning
    Shi, Yanyan
    Shi, Dianxi
    Qiao, Ziteng
    Wang, Zhen
    Zhang, Yi
    Yang, Shaowu
    Qiu, Chunping
    NEURAL NETWORKS, 2023, 164 : 617 - 630
  • [30] ISM-Net: Mining incremental semantics for class incremental learning
    Qiu, Zihuan
    Xu, Linfeng
    Wang, Zhichuan
    Wu, Qingbo
    Meng, Fanman
    Li, Hongliang
    NEUROCOMPUTING, 2023, 523 : 130 - 143