Connection-Based Knowledge Transfer for Class Incremental Learning

Cited: 0
Authors
Zhao, Guangzhi [1]
Mu, Kedian [1]
Affiliations
[1] Peking Univ, Sch Math Sci, Beijing, Peoples R China
Source
2023 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, IJCNN | 2023
Funding
National Natural Science Foundation of China
Keywords
class incremental learning; one class classification; contrastive learning
DOI
10.1109/IJCNN54540.2023.10191696
CLC Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
We consider the problem of class incremental learning (CIL), where an agent aims to learn new classes continually without forgetting previous ones. As one of the mainstream paradigms of incremental learning, parameter isolation methods prevent forgetting by allocating different model parameters to each task, but knowledge transfer across tasks is difficult and usually overlooked. As a consequence, the discriminability between old and new classes is limited, especially when the training data of old classes is no longer accessible. In this paper, we propose a new data-free approach named Twin Contrastive Networks (TCN) for CIL that exploits the connections among tasks and network parameters. Specifically, we treat CIL as a sequence of one-class classification tasks and train a separate classifier to identify each class. To facilitate knowledge transfer and make full use of accumulated knowledge, a twin network structure is adopted to learn different feature representations for future use. When new classes arrive, the previous twin networks are used directly in a contrastive loss to improve the model's discriminability. TCN avoids catastrophic forgetting by fixing all learned parameters, while still leveraging the prior knowledge contained in the frozen networks. Experiments on three widely used incremental learning benchmarks verify the effectiveness of TCN.
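Since the paper's code is not part of this record, the following is a minimal, hypothetical PyTorch sketch of the mechanism the abstract describes: each class gets its own small "twin" encoder, the encoders of earlier classes are frozen, and a new class's encoder is trained with a contrastive loss that pulls its own embeddings together while pushing them away from the embeddings produced by the frozen twins. All names here (TwinNet, contrastive_transfer_loss) and the InfoNCE-style form of the loss are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class TwinNet(nn.Module):
    # Toy encoder standing in for one twin network of a single class
    # (hypothetical; the paper's actual architecture is not given here).
    def __init__(self, in_dim=32, emb_dim=16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 64), nn.ReLU(), nn.Linear(64, emb_dim))

    def forward(self, x):
        return F.normalize(self.net(x), dim=-1)  # unit-norm embeddings

def contrastive_transfer_loss(new_net, old_nets, x, temperature=0.1):
    # InfoNCE-style objective (an assumption): embeddings of the new
    # class attract each other and repel the views produced by the
    # frozen twins of previously learned classes.
    z = new_net(x)                                   # (B, D), trainable
    with torch.no_grad():                            # old twins stay fixed
        z_old = [net(x) for net in old_nets]
    sim_pos = (z @ z.t()) / temperature              # within-class similarity
    sim_pos.fill_diagonal_(float('-inf'))            # exclude self-pairs
    if z_old:
        neg = torch.cat(z_old, dim=0)                # (K*B, D)
        logits = torch.cat([sim_pos, (z @ neg.t()) / temperature], dim=1)
    else:
        logits = sim_pos
    # -log( sum(exp(pos)) / sum(exp(all)) ), averaged over the batch
    return (torch.logsumexp(logits, dim=1)
            - torch.logsumexp(sim_pos, dim=1)).mean()

# Usage sketch: three frozen twins from earlier classes, one new class.
old_nets = [TwinNet().eval() for _ in range(3)]
for net in old_nets:
    for p in net.parameters():
        p.requires_grad_(False)
new_net = TwinNet()
x = torch.randn(8, 32)                               # batch of new-class data
loss = contrastive_transfer_loss(new_net, old_nets, x)
loss.backward()

In this framing, forgetting cannot occur for the old classes because their twins' parameters never change; knowledge transfer happens only through the contrastive term, which matches the abstract's description of fixing all learned parameters while still using the previous networks directly.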
Pages: 8