Theory on Forgetting and Generalization of Continual Learning

Cited by: 0
Authors
Lin, Sen [1]
Ju, Peizhong [1]
Liang, Yingbin [1]
Shroff, Ness [1,2]
Affiliations
[1] Ohio State Univ, Dept ECE, Columbus, OH 43210 USA
[2] Ohio State Univ, Dept CSE, Columbus, OH 43210 USA
Source
INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 202, 2023
Funding
U.S. National Science Foundation
Keywords
DOI
(none)
CLC number
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Continual learning (CL), which aims to learn a sequence of tasks, has attracted significant recent attention. However, most work has focused on the experimental performance of CL, and theoretical studies of CL are still limited. In particular, there is a lack of understanding of what factors are important and how they affect "catastrophic forgetting" and generalization performance. To fill this gap, our theoretical analysis, under overparameterized linear models, provides the first-known explicit form of the expected forgetting and generalization error for a general CL setup with an arbitrary number of tasks. Further analysis of this key result yields a number of theoretical explanations of how overparameterization, task similarity, and task ordering affect both the forgetting and the generalization error of CL. More interestingly, by conducting experiments on real datasets using deep neural networks (DNNs), we show that some of these insights go beyond linear models and carry over to practical setups. In particular, we use concrete examples to show that our results not only explain some interesting empirical observations in recent studies, but also motivate better practical algorithm designs for CL.
Pages: 23