Overcoming Catastrophic Forgetting in Continual Learning by Exploring Eigenvalues of Hessian Matrix

Cited: 13
Authors
Kong, Yajing [1 ,2 ]
Liu, Liu [1 ,2 ]
Chen, Huanhuan [3 ]
Kacprzyk, Janusz [4 ]
Tao, Dacheng [1 ,2 ]
Affiliations
[1] Univ Sydney, Sydney AI Ctr, Fac Engn, Darlington, NSW 2008, Australia
[2] Univ Sydney, Sch Comp Sci, Fac Engn, Darlington, NSW 2008, Australia
[3] Univ Sci & Technol China, Sch Comp Sci & Technol, Hefei 230027, Peoples R China
[4] Polish Acad Sci, Syst Res Inst, PL-01447 Warsaw, Poland
Keywords
Task analysis; Convergence; Eigenvalues and eigenfunctions; Data models; Training; Upper bound; Loss measurement; Catastrophic forgetting; continual learning (CL); incremental learning; lifelong learning; neural networks; memory
DOI
10.1109/TNNLS.2023.3292359
CLC Number
TP18 [Theory of Artificial Intelligence]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Neural networks tend to suffer performance deterioration on previous tasks when they are trained on multiple tasks sequentially without access to previous data. This problem, known as catastrophic forgetting, is a significant challenge in continual learning (CL). To overcome catastrophic forgetting, regularization-based CL methods construct a regularization term, which can be viewed as an approximation of the loss function of previous tasks, to penalize the update of parameters. However, rigorous theoretical analysis of regularization-based methods is limited. Therefore, we theoretically analyze the forgetting and convergence properties of regularization-based methods. The theoretical results demonstrate that the upper bound of the forgetting is related to the maximum eigenvalue of the Hessian matrix. Hence, to decrease this upper bound, we propose the eiGenvalues ExplorAtion Regularization-based (GEAR) method, which explores the geometric properties of the approximate loss of prior tasks with respect to the maximum eigenvalue. Extensive experimental results demonstrate that our method mitigates catastrophic forgetting and outperforms existing regularization-based methods.
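The abstract's two ingredients lend themselves to a concrete illustration: a regularization term approximating the loss of previous tasks, and the maximum eigenvalue of the Hessian, which the analysis ties to the upper bound of the forgetting. Below is a minimal PyTorch sketch, not the authors' GEAR implementation: it assumes a generic quadratic penalty in the spirit of regularization-based CL and estimates the top Hessian eigenvalue by power iteration with Hessian-vector products; the function names and the toy model are illustrative assumptions.

import torch

def quadratic_penalty(model, old_params, importance, lam=1.0):
    # Regularization term approximating the previous tasks' loss:
    # (lam / 2) * sum_i importance_i * (theta_i - theta_old_i)^2.
    penalty = 0.0
    for name, p in model.named_parameters():
        penalty = penalty + (importance[name] * (p - old_params[name]) ** 2).sum()
    return 0.5 * lam * penalty

def max_hessian_eigenvalue(loss, params, iters=20):
    # Estimate the largest Hessian eigenvalue by power iteration using
    # Hessian-vector products, so the Hessian is never formed explicitly.
    grads = torch.autograd.grad(loss, params, create_graph=True)
    v = [torch.randn_like(p) for p in params]
    eig = loss.new_zeros(())
    for _ in range(iters):
        norm = torch.sqrt(sum((u * u).sum() for u in v))
        v = [u / norm for u in v]                                 # normalize v
        gv = sum((g * u).sum() for g, u in zip(grads, v))         # g . v
        hv = torch.autograd.grad(gv, params, retain_graph=True)   # H v
        eig = sum((h * u).sum() for h, u in zip(hv, v))           # Rayleigh quotient v^T H v
        v = [h.detach() for h in hv]
    return eig.detach()

# Toy usage (hypothetical data and model, purely for illustration):
model = torch.nn.Linear(4, 2)
x, y = torch.randn(8, 4), torch.randint(0, 2, (8,))
loss = torch.nn.functional.cross_entropy(model(x), y)
old = {n: p.detach().clone() for n, p in model.named_parameters()}
imp = {n: torch.ones_like(p) for n, p in model.named_parameters()}
total = loss + quadratic_penalty(model, old, imp, lam=0.1)   # loss to optimize
top_eig = max_hessian_eigenvalue(loss, list(model.parameters()))

Intuitively, a smaller top eigenvalue corresponds to a flatter approximate loss around the previous tasks' solution, which is the geometric property the bound in the abstract connects to forgetting.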
Pages: 16196-16210
Number of pages: 15
Related Papers
50 records
  • [1] Quantum Continual Learning Overcoming Catastrophic Forgetting
    Jiang, Wenjie
    Lu, Zhide
    Deng, Dong-Ling
    CHINESE PHYSICS LETTERS, 2022, 39 (05)
  • [2] PNSP: Overcoming catastrophic forgetting using Primary Null Space Projection in continual learning
    Zhou, DaiLiang
    Song, YongHong
    PATTERN RECOGNITION LETTERS, 2024, 179 : 137 - 143
  • [3] Overcoming catastrophic forgetting in molecular property prediction using continual learning of sequential episodes
    Ranjan, Sakshi
    Singh, Sanjay Kumar
    EXPERT SYSTEMS WITH APPLICATIONS, 2025, 267
  • [4] Overcoming Catastrophic Forgetting Using Sparse Coding and Meta Learning
    Hurtado, Julio
    Lobel, Hans
    Soto, Alvaro
    IEEE ACCESS, 2021, 9 : 88279 - 88290
  • [5] Knowledge Lock: Overcoming Catastrophic Forgetting in Federated Learning
    Wei, Guoyizhe
    Li, Xiu
    ADVANCES IN KNOWLEDGE DISCOVERY AND DATA MINING, PAKDD 2022, PT I, 2022, 13280 : 601 - 612
  • [6] A Continual Learning Survey: Defying Forgetting in Classification Tasks
    De Lange, Matthias
    Aljundi, Rahaf
    Masana, Marc
    Parisot, Sarah
    Jia, Xu
    Leonardis, Ales
    Slabaugh, Greg
    Tuytelaars, Tinne
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2022, 44 (07) : 3366 - 3385
  • [7] Continual Learning for Instance Segmentation to Mitigate Catastrophic Forgetting
    Lee, Jeong Jun
    Lee, Seung Il
    Kim, Hyun
    18TH INTERNATIONAL SOC DESIGN CONFERENCE 2021 (ISOCC 2021), 2021, : 85 - 86
  • [8] Assessment of catastrophic forgetting in continual credit card fraud detection
    Lebichot, B.
    Siblini, W.
    Paldino, G. M.
    Le Borgne, Y. -A.
    Oble, F.
    Bontempi, G.
    EXPERT SYSTEMS WITH APPLICATIONS, 2024, 249
  • [9] Preventing Catastrophic Forgetting in Continual Learning of New Natural Language Tasks
    Kar, Sudipta
    Castellucci, Giuseppe
    Filice, Simone
    Malmasi, Shervin
    Rokhlenko, Oleg
    PROCEEDINGS OF THE 28TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, KDD 2022, 2022, : 3137 - 3145
  • [10] Overcoming Catastrophic Forgetting with Gaussian Mixture Replay
    Pfuelb, Benedikt
    Gepperth, Alexander
2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021