A Comprehensive Survey of Continual Learning: Theory, Method and Application

Cited by: 111
Authors
Wang, Liyuan [1]
Zhang, Xingxing
Su, Hang
Zhu, Jun [1]
Affiliation
[1] Tsinghua Univ, Tsinghua Bosch Joint Ctr ML, BNRist Ctr, Dept Comp Sci & Tech, Inst AI, THBI Lab, Beijing 100190, Peoples R China
Keywords
Continual learning; incremental learning; lifelong learning; catastrophic forgetting; neural networks; replay
DOI
10.1109/TPAMI.2024.3367329
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
To cope with real-world dynamics, an intelligent system needs to incrementally acquire, update, accumulate, and exploit knowledge throughout its lifetime. This ability, known as continual learning, provides a foundation for AI systems to develop themselves adaptively. In a general sense, continual learning is explicitly limited by catastrophic forgetting, where learning a new task usually results in a dramatic performance drop on previously learned tasks. Beyond this, numerous advances have emerged in recent years that largely extend the understanding and application of continual learning. The growing and widespread interest in this direction demonstrates its practical significance as well as its complexity. In this work, we present a comprehensive survey of continual learning, seeking to bridge the basic settings, theoretical foundations, representative methods, and practical applications. Based on existing theoretical and empirical results, we summarize the general objectives of continual learning as ensuring a proper stability-plasticity trade-off and adequate intra/inter-task generalizability in the context of resource efficiency. We then provide a state-of-the-art and elaborated taxonomy, extensively analyzing how representative strategies address continual learning and how they are adapted to particular challenges in various applications. Through an in-depth discussion of promising directions, we believe that such a holistic perspective can greatly facilitate subsequent exploration in this field and beyond.
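To make the stability-plasticity trade-off named in the abstract concrete, the sketch below illustrates one common regularization-based response to catastrophic forgetting: an EWC-style quadratic penalty that anchors weights deemed important for old tasks while the model fits a new one. This is not the survey's own method; the function names (fisher_diagonal, penalized_loss), the penalty weight lam, and the use of PyTorch are illustrative assumptions.

```python
# Illustrative sketch only (assumes PyTorch; helper names are hypothetical):
# an EWC-style quadratic penalty balancing plasticity (new-task loss) against
# stability (keeping weights close to values important for old tasks).
import torch


def fisher_diagonal(model, old_loader, loss_fn):
    """Diagonal Fisher estimate from squared gradients on old-task data."""
    fisher = {n: torch.zeros_like(p) for n, p in model.named_parameters()
              if p.requires_grad}
    model.eval()
    for x, y in old_loader:
        model.zero_grad()
        loss_fn(model(x), y).backward()
        for n, p in model.named_parameters():
            if p.grad is not None:
                fisher[n] += p.grad.detach() ** 2
    return {n: f / max(len(old_loader), 1) for n, f in fisher.items()}


def penalized_loss(model, new_task_loss, old_params, fisher, lam=100.0):
    """New-task loss plus a stability term; larger lam favors stability."""
    penalty = sum((fisher[n] * (p - old_params[n]) ** 2).sum()
                  for n, p in model.named_parameters() if n in fisher)
    return new_task_loss + 0.5 * lam * penalty
```

In practice, old_params would hold detached copies of the weights saved after the old task, and lam directly controls where the model sits on the stability-plasticity axis; replay- and architecture-based strategies covered by the survey address the same trade-off by different means.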
Pages: 5362-5383
Page count: 22
Related papers
50 items in total
  • [21] On the Beneficial Effects of Reinjections for Continual Learning
    Solinas M.
    Reyboz M.
    Rousset S.
    Galliere J.
    Mainsant M.
    Bourrier Y.
    Molnos A.
    Mermillod M.
    SN Computer Science, 4 (1)
  • [22] Continual learning with invertible generative models
    Pomponi, Jary
    Scardapane, Simone
    Uncini, Aurelio
    NEURAL NETWORKS, 2023, 164 : 606 - 616
  • [23] RTRA: Rapid Training of Regularization-based Approaches in Continual Learning
    Nokhwal, Sahil
    Kumar, Nirman
    2023 10TH INTERNATIONAL CONFERENCE ON SOFT COMPUTING & MACHINE INTELLIGENCE, ISCMI, 2023, : 188 - 192
  • [24] Overcoming Catastrophic Forgetting in Continual Learning by Exploring Eigenvalues of Hessian Matrix
    Kong, Yajing
    Liu, Liu
    Chen, Huanhuan
    Kacprzyk, Janusz
    Tao, Dacheng
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (11) : 16196 - 16210
  • [25] Federated Continual Learning via Knowledge Fusion: A Survey
    Yang, Xin
    Yu, Hao
    Gao, Xin
    Wang, Hao
    Zhang, Junbo
    Li, Tianrui
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2024, 36 (08) : 3832 - 3850
  • [26] Recent Advances of Foundation Language Models-based Continual Learning: A Survey
    Yang, Yutao
    Zhou, Jie
    Ding, Xuan wen
    Huai, Tianyu
    Liu, Shunyu
    Chen, Qin
    Xie, Yuan
    He, Liang
    ACM COMPUTING SURVEYS, 2025, 57 (05)
  • [27] Logarithmic Continual Learning
    Masarczyk, Wojciech
    Wawrzynski, Pawel
    Marczak, Daniel
    Deja, Kamil
    Trzcinski, Tomasz
    IEEE ACCESS, 2022, 10 : 117001 - 117010
  • [28] Assessor-guided learning for continual environments
    Ma'sum, Muhammad Anwar
    Pratama, Mahardhika
    Lughofer, Edwin
    Ding, Weiping
    Jatmiko, Wisnu
    INFORMATION SCIENCES, 2023, 640
  • [29] Continual Learning Using Bayesian Neural Networks
    Li, Honglin
    Barnaghi, Payam
Enshaeifar, Shirin
    Ganz, Frieder
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2021, 32 (09) : 4243 - 4252
  • [30] Continual Learning with Sparse Progressive Neural Networks
    Ergun, Esra
    Toreyin, Behcet Ugur
2020 28TH SIGNAL PROCESSING AND COMMUNICATIONS APPLICATIONS CONFERENCE (SIU), 2020