Overcoming Catastrophic Forgetting in Continual Learning by Exploring Eigenvalues of Hessian Matrix

Cited by: 13
|
Authors
Kong, Yajing [1 ,2 ]
Liu, Liu [1 ,2 ]
Chen, Huanhuan [3 ]
Kacprzyk, Janusz [4 ]
Tao, Dacheng [1 ,2 ]
Affiliations
[1] Univ Sydney, Sydney AI Ctr, Fac Engn, Darlington, NSW 2008, Australia
[2] Univ Sydney, Sch Comp Sci, Fac Engn, Darlington, NSW 2008, Australia
[3] Univ Sci & Technol China, Sch Comp Sci & Technol, Hefei 230027, Peoples R China
[4] Polish Acad Sci, Syst Res Inst, PL-01447 Warsaw, Poland
Keywords
Task analysis; Convergence; Eigenvalues and eigenfunctions; Data models; Training; Upper bound; Loss measurement; Catastrophic forgetting; continual learning (CL); incremental learning; lifelong learning; NEURAL-NETWORKS; MEMORY;
DOI
10.1109/TNNLS.2023.3292359
Chinese Library Classification
TP18 [Theory of Artificial Intelligence];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Neural networks tend to suffer performance deterioration on previous tasks when they are applied to multiple tasks sequentially without access to previous data. This problem, commonly known as catastrophic forgetting, is a significant challenge in continual learning (CL). To overcome catastrophic forgetting, regularization-based CL methods construct a regularization term, which can be considered an approximation of the loss function of previous tasks, to penalize the update of parameters. However, rigorous theoretical analysis of regularization-based methods is limited. Therefore, we theoretically analyze the forgetting and convergence properties of regularization-based methods. The theoretical results demonstrate that the upper bound of the forgetting is related to the maximum eigenvalue of the Hessian matrix. Hence, to decrease this upper bound, we propose the eiGenvalues ExplorAtion Regularization-based (GEAR) method, which explores the geometric properties of the approximation loss of prior tasks with respect to the maximum eigenvalue. Extensive experimental results demonstrate that our method mitigates catastrophic forgetting and outperforms existing regularization-based methods.
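The central quantity in the abstract is the maximum eigenvalue of the Hessian of the (approximate) loss of prior tasks. As a minimal, hypothetical illustration of how that quantity can be estimated in practice (this is not the authors' GEAR implementation, and the helper name `top_eigenvalue` is my own), power iteration needs only Hessian-vector products, never the full Hessian:

```python
import numpy as np

def top_eigenvalue(hvp, dim, iters=100, seed=0):
    """Estimate the dominant eigenvalue of a symmetric matrix,
    given only a Hessian-vector-product function `hvp`."""
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(dim)
    v /= np.linalg.norm(v)          # random unit start vector
    for _ in range(iters):
        w = hvp(v)                  # one Hessian-vector product
        v = w / np.linalg.norm(w)   # renormalize toward top eigenvector
    return v @ hvp(v)               # Rayleigh quotient at convergence

# Toy symmetric "Hessian" with known eigenvalues {3.0, 1.0}
H = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam_max = top_eigenvalue(lambda v: H @ v, dim=2)
print(lam_max)  # close to 3.0, the maximum eigenvalue
```

In a neural-network setting the `hvp` callback would be supplied by automatic differentiation (a double-backward pass) rather than an explicit matrix, so the estimate scales to models where the Hessian itself is far too large to form.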
Pages: 16196-16210
Page count: 15
Related Papers
50 items total
  • [21] Comparative Analysis of Catastrophic Forgetting in Metric Learning
    Huo, Jiahao
    van Zyl, Terence L.
    2020 7TH INTERNATIONAL CONFERENCE ON SOFT COMPUTING & MACHINE INTELLIGENCE (ISCMI 2020), 2020, : 68 - 72
  • [22] Overcoming Long-Term Catastrophic Forgetting Through Adversarial Neural Pruning and Synaptic Consolidation
    Peng, Jian
    Tang, Bo
    Jiang, Hao
    Li, Zhuo
    Lei, Yinjie
    Lin, Tao
    Li, Haifeng
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2022, 33 (09) : 4243 - 4256
  • [23] Example forgetting and rehearsal in continual learning
    Benko, Beatrix
    PATTERN RECOGNITION LETTERS, 2024, 179 : 65 - 72
  • [24] A dynamic routing CapsNet based on increment prototype clustering for overcoming catastrophic forgetting
    Wang, Meng
    Guo, Zhengbing
    Li, Huafeng
    IET COMPUTER VISION, 2022, 16 (01) : 83 - 97
  • [25] Mitigating Catastrophic Forgetting with Complementary Layered Learning
    Mondesire, Sean
    Wiegand, R. Paul
    ELECTRONICS, 2023, 12 (03)
  • [26] SeNA-CNN: Overcoming Catastrophic Forgetting in Convolutional Neural Networks by Selective Network Augmentation
    Zacarias, Abel
    Alexandre, Luis A.
    ARTIFICIAL NEURAL NETWORKS IN PATTERN RECOGNITION, ANNPR 2018, 2018, 11081 : 102 - 112
  • [27] Ensemble Learning in Fixed Expansion Layer Networks for Mitigating Catastrophic Forgetting
    Coop, Robert
    Mishtal, Aaron
    Arel, Itamar
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2013, 24 (10) : 1623 - 1634
  • [28] Reducing Catastrophic Forgetting With Associative Learning: A Lesson From Fruit Flies
    Shen, Yang
    Dasgupta, Sanjoy
    Navlakha, Saket
    NEURAL COMPUTATION, 2023, 35 (11) : 1797 - 1819
  • [29] State Primitive Learning to Overcome Catastrophic Forgetting in Robotics
    Xiong, Fangzhou
    Liu, Zhiyong
    Huang, Kaizhu
    Yang, Xu
    Qiao, Hong
    COGNITIVE COMPUTATION, 2021, 13 (02) : 394 - 402