Adaptive Knowledge Matching for Exemplar-Free Class-Incremental Learning

Cited by: 0
Authors
Chen, Runhang [1]
Jing, Xiao-Yuan [1,2,3]
Chen, Haowen [4]
Affiliations
[1] Wuhan Univ, Sch Comp Sci, Wuhan 430072, Peoples R China
[2] Guangdong Univ Petrochem Technol, Guangdong Prov Key Lab Petrochem Equipment Fault, Maoming 525000, Peoples R China
[3] Guangdong Univ Petrochem Technol, Sch Comp, Maoming 525000, Peoples R China
[4] Informat Engn Univ, Sch Cyber Sci & Engn, Zhengzhou 450001, Peoples R China
Source
PATTERN RECOGNITION AND COMPUTER VISION, PT III, PRCV 2024 | 2025, Vol. 15033
Keywords
Class-Incremental Learning; Exemplar-Free Class-Incremental Learning; Knowledge Distillation
DOI
10.1007/978-981-97-8502-5_21
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Exemplar-free class-incremental learning (EFCIL) presents a significant challenge: models must learn tasks sequentially without access to data from previous tasks. The challenge is exacerbated when the initial dataset is insufficient to facilitate adaptation to subsequent tasks. Existing methods often employ a joint loss function to improve model adaptability and knowledge retention, yet they still struggle to mitigate forgetting of old-class knowledge. To address this issue, we propose a new approach called Adaptive Knowledge Matching (AKM). We first adopt a log-cosh loss function to better retain previously learned knowledge. We then introduce an adaptive weighting strategy that dynamically balances knowledge from old and new classes. Experiments on benchmark datasets (CIFAR100, Tiny-ImageNet, and ImageNet-Subset) demonstrate the effectiveness of the proposed approach.
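The abstract names two ingredients: a log-cosh loss that matches the current model against the frozen old-task model, and an adaptive weight that balances old- and new-class objectives. The PyTorch sketch below illustrates one plausible instantiation of that combination; the function names (akm_style_loss, adaptive_weight), the square-root class-ratio weighting rule, and the feature-level matching are assumptions for illustration, not the paper's published formulation.

```python
import math

import torch
import torch.nn.functional as F


def log_cosh_distillation(student_feats: torch.Tensor,
                          teacher_feats: torch.Tensor) -> torch.Tensor:
    """Log-cosh penalty between current-model and frozen old-model features.

    Computed stably as |x| + log1p(exp(-2|x|)) - log(2) instead of
    log(cosh(x)) directly, which overflows for large |x|.
    """
    diff = (student_feats - teacher_feats).abs()
    return (diff + torch.log1p(torch.exp(-2.0 * diff)) - math.log(2.0)).mean()


def adaptive_weight(num_old_classes: int, num_new_classes: int,
                    base: float = 1.0) -> float:
    """Illustrative weighting rule (assumption): distillation strength grows
    with the share of previously learned classes, a common CIL heuristic."""
    return base * math.sqrt(num_old_classes / (num_old_classes + num_new_classes))


def akm_style_loss(logits: torch.Tensor, targets: torch.Tensor,
                   student_feats: torch.Tensor, teacher_feats: torch.Tensor,
                   num_old_classes: int, num_new_classes: int) -> torch.Tensor:
    """Joint objective: cross-entropy on the new task plus the adaptively
    weighted log-cosh matching term."""
    lam = adaptive_weight(num_old_classes, num_new_classes)
    ce = F.cross_entropy(logits, targets)
    kd = log_cosh_distillation(student_feats, teacher_feats)
    return ce + lam * kd
```

Compared with a plain L2 matching term, log-cosh behaves quadratically near zero but grows only linearly for large residuals, so isolated large feature drifts do not dominate the retention objective.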
Pages: 289-303 (15 pages)