Continuous transfer of neural network representational similarity for incremental learning

Cited by: 43
Authors
Tian, Songsong [1 ,2 ]
Li, Weijun [1 ,3 ,4 ]
Ning, Xin [1 ,3 ,4 ,5 ]
Ran, Hang [1 ]
Qin, Hong [1 ,3 ,4 ]
Tiwari, Prayag [6 ]
Affiliations
[1] Chinese Acad Sci, Inst Semicond, Beijing 100083, Peoples R China
[2] Univ Chinese Acad Sci, Sch Elect Elect & Commun Engn, Beijing 100049, Peoples R China
[3] Univ Chinese Acad Sci, Ctr Mat Sci & Optoelect Engn, Beijing 100049, Peoples R China
[4] Univ Chinese Acad Sci, Sch Integrated Circuits, Beijing 100049, Peoples R China
[5] Zhongke Ruitu Technol Co Ltd, Beijing 100096, Peoples R China
[6] Halmstad Univ, Sch Informat Technol, S-30118 Halmstad, Sweden
Keywords
Incremental learning; Pre-trained model; Knowledge distillation; Neural network representation
DOI
10.1016/j.neucom.2023.126300
Chinese Library Classification
TP18 [Artificial intelligence theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The incremental learning paradigm in machine learning has long been a focus of academic research. It resembles the way biological systems learn, and it reduces energy consumption by avoiding excessive retraining. Existing studies exploit the powerful feature extraction capabilities of pre-trained models for incremental learning, but the feature knowledge inside the neural network remains insufficiently utilized. To address this issue, this paper proposes a novel method called Pre-trained Model Knowledge Distillation (PMKD), which combines knowledge distillation of neural network representations with replay. The paper designs a loss function based on centered kernel alignment (CKA) to transfer neural network representation knowledge from the pre-trained model to the incremental model layer by layer. Additionally, a memory buffer used for Dark Experience Replay helps the model better retain past knowledge. Experiments show that PMKD achieves superior performance across various datasets and buffer sizes, reaching the best class-incremental learning accuracy among the compared methods. The open-source code is published at https://github.com/TianSongS/PMKD-IL. (c) 2023 The Author(s). Published by Elsevier B.V. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/).
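The abstract describes a layer-by-layer distillation loss built on centered kernel alignment. The sketch below is only an illustration of that idea using the standard linear-CKA formula, not the authors' released implementation (the paper's code lives in the repository linked above); the names `linear_cka` and `cka_distillation_loss` are illustrative assumptions.

```python
import numpy as np

def linear_cka(X, Y):
    """Linear centered kernel alignment between two feature matrices.

    X, Y: activations of shape (n_samples, dim_x) and (n_samples, dim_y).
    Returns a similarity in [0, 1]; 1 means the representations match
    up to isotropic scaling and an orthogonal transform.
    """
    # Center each feature dimension so Gram matrices use centered kernels.
    X = X - X.mean(axis=0, keepdims=True)
    Y = Y - Y.mean(axis=0, keepdims=True)
    hsic = np.linalg.norm(X.T @ Y, "fro") ** 2
    norm_x = np.linalg.norm(X.T @ X, "fro")
    norm_y = np.linalg.norm(Y.T @ Y, "fro")
    return hsic / (norm_x * norm_y)

def cka_distillation_loss(student_feats, teacher_feats):
    """Layer-by-layer distillation penalty: sum of (1 - CKA) over
    corresponding student/teacher layer activations."""
    return sum(1.0 - linear_cka(s, t)
               for s, t in zip(student_feats, teacher_feats))
```

Because CKA is invariant to scaling and orthogonal rotations of the features, a `1 - CKA` penalty compares what the two networks represent rather than the raw activation values, which is what makes it suitable for transferring knowledge between a pre-trained model and an incrementally trained one.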
Pages: 11
Related papers
50 records in total
  • [41] Automatic Security Classification Based on Incremental Learning and Similarity Comparison
    Liang, Yan
    Wen, Zepeng
    Tao, Yizheng
    Li, GongLiang
    Guo, Bing
    PROCEEDINGS OF 2019 IEEE 8TH JOINT INTERNATIONAL INFORMATION TECHNOLOGY AND ARTIFICIAL INTELLIGENCE CONFERENCE (ITAIC 2019), 2019, : 812 - 817
  • [42] Improve the Performance and Stability of Incremental Learning by a Similarity Harmonizing Mechanism
    Ma, Jing
    Liao, Mingjie
    Zhang, Lei
    IEEE ACCESS, 2022, 10 : 117429 - 117438
  • [43] TL-NID: Deep Neural Network with Transfer Learning for Network Intrusion Detection
    Masum, Mohammad
    Shahriar, Hossain
    INTERNATIONAL CONFERENCE FOR INTERNET TECHNOLOGY AND SECURED TRANSACTIONS (ICITST-2020), 2020, : 64 - 70
  • [44] Broad Convolutional Neural Network Based Industrial Process Fault Diagnosis With Incremental Learning Capability
    Yu, Wanke
    Zhao, Chunhui
    IEEE TRANSACTIONS ON INDUSTRIAL ELECTRONICS, 2020, 67 (06) : 5081 - 5091
  • [45] Recognizing the Gradual Changes in sEMG Characteristics Based on Incremental Learning of Wavelet Neural Network Ensemble
    Duan, Feng
    Dai, Lili
    IEEE TRANSACTIONS ON INDUSTRIAL ELECTRONICS, 2017, 64 (05) : 4276 - 4286
  • [46] Nonlinear single layer neural network training algorithm for incremental, nonstationary and distributed learning scenarios
    Martinez-Rego, David
    Fontenla-Romero, Oscar
    Alonso-Betanzos, Amparo
    PATTERN RECOGNITION, 2012, 45 (12) : 4536 - 4546
  • [47] Parameter incremental learning algorithm for neural networks
    Wan, Sheng
    Banta, Larry E.
    IEEE TRANSACTIONS ON NEURAL NETWORKS, 2006, 17 (06) : 1424 - 1438
  • [48] A neural tree with partial incremental learning capability
    Su, Mu-Chun
    Lo, Hsu-Hsun
    PROCEEDINGS OF 2007 INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND CYBERNETICS, VOLS 1-7, 2007, : 6 - 11
  • [49] Cable fault diagnosis with generalization capability using incremental learning and deep convolutional neural network
    Chi, Peng
    Liang, Rui
    Hao, Chuncheng
    Li, Guochang
    Xin, Meng
    ELECTRIC POWER SYSTEMS RESEARCH, 2025, 241
  • [50] Transfer Incremental Learning Using Data Augmentation
    Hacene, Ghouthi Boukli
    Gripon, Vincent
    Farrugia, Nicolas
    Arzel, Matthieu
    Jezequel, Michel
    APPLIED SCIENCES-BASEL, 2018, 8 (12)