Class incremental learning with KL constraint and multi-strategy exemplar selection for classification based on MMFA model

Times cited: 0
Authors
Li, Yang [1 ]
Du, Lan [1 ]
Chen, Jian [1 ]
Affiliations
[1] Xidian Univ, Natl Key Lab Radar Signal Proc, Xian 710071, Peoples R China
Funding
US National Science Foundation
Keywords
Class incremental learning (CIL); Kullback-Leibler (KL) constraint; Exemplar selection; Classification; Recognition
DOI
10.1016/j.ins.2024.121266
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology]
Discipline Classification Code
0812
Abstract
Class incremental learning (CIL) learns new classes continuously by updating an existing model, rather than retraining a model from scratch on all seen classes as traditional offline learning does; it is therefore better suited to classification in dynamic environments, where new classes are captured progressively. However, this update mode causes key knowledge of old classes to be lost, leading to the catastrophic forgetting (CF) problem. In this paper, a novel CIL method with a Kullback-Leibler constraint and multi-strategy exemplar selection (CIL-KLMES) is proposed for classification based on the max-margin factor analysis (MMFA) model. To handle the CF problem, CIL-KLMES imposes a Kullback-Leibler (KL) divergence term on the important parameters when updating the model, constraining their distributions to stay close so that the updated model does not deviate too far from the previous one and the knowledge of old classes is preserved. Moreover, CIL-KLMES selects a few representative exemplars from the old classes based on a robust description of the data distribution and the classification decision boundary. By replaying these representative exemplars together with new-class data when updating the model, the key knowledge of old classes is further preserved and the CF problem is sufficiently alleviated. Experimental results demonstrate the effectiveness of CIL-KLMES.
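The abstract describes two mechanisms without giving formulas: a KL-divergence penalty that keeps the updated parameter distributions close to the previous model's, and a multi-strategy exemplar selection that covers both the data distribution and the decision boundary. The sketch below is only a minimal illustration of those two ideas under simplifying assumptions (diagonal-Gaussian parameter distributions, precomputed per-sample classifier margins); the function names, the margin scores, and the weight `lam` are hypothetical and not taken from the paper.

```python
import numpy as np

def kl_diag_gaussian(mu_new, var_new, mu_old, var_old):
    """KL( N(mu_new, var_new) || N(mu_old, var_old) ) for diagonal Gaussians.
    A drift penalty of this form discourages the updated parameter
    distribution from deviating too far from the previous model's
    (a hypothetical stand-in for the paper's KL constraint)."""
    return 0.5 * np.sum(
        np.log(var_old / var_new)
        + (var_new + (mu_new - mu_old) ** 2) / var_old
        - 1.0
    )

def select_exemplars(features, labels, margins, cls, m):
    """Pick m exemplars of class `cls` with two strategies:
    half nearest the class mean (describe the data distribution),
    half with the smallest classifier margin (describe the boundary)."""
    idx = np.flatnonzero(labels == cls)
    f = features[idx]
    near_mean = idx[np.argsort(np.linalg.norm(f - f.mean(axis=0), axis=1))]
    near_boundary = idx[np.argsort(margins[idx])]  # small margin = near boundary
    chosen = np.concatenate([near_mean[: m // 2], near_boundary[: m - m // 2]])
    return np.unique(chosen)  # may overlap, so the set can be smaller than m

# Toy usage (all numbers illustrative): a KL-regularized update objective.
rng = np.random.default_rng(0)
feats = rng.normal(size=(100, 8))
labels = rng.integers(0, 3, size=100)
margins = rng.uniform(0.0, 2.0, size=100)
exemplars = select_exemplars(feats, labels, margins, cls=0, m=10)

mu_old, var_old = np.zeros(8), np.ones(8)
mu_new, var_new = mu_old + 0.1, np.full(8, 0.9)
task_loss = 1.0   # placeholder loss on new-class data plus replayed exemplars
lam = 0.5         # hypothetical weight on the KL constraint
total = task_loss + lam * kl_diag_gaussian(mu_new, var_new, mu_old, var_old)
```

In the paper's Bayesian MMFA setting the constrained quantities would be parameter posteriors rather than point estimates; the closed-form diagonal-Gaussian KL above is just the simplest concrete instance of such a constraint.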
Pages: 21