Dual Network Based Complementary Learning System for Continual Learning

Cited by: 0
Authors
Kumari, Geeta [1 ]
Song, Iickho [1 ]
Affiliations
[1] Korea Adv Inst Sci & Technol, Sch Elect Engn, Daejeon, South Korea
Source
2021 IEEE/CIC International Conference on Communications in China (ICCC Workshops), 2021
Funding
National Research Foundation, Singapore
Keywords
neural networks; catastrophic forgetting; continual learning; online learning; memory replay;
DOI
10.1109/ICCCWorkshops52231.2021.9538861
CLC classification
TP [Automation Technology, Computer Technology]
Discipline code
0812
Abstract
Catastrophic forgetting is a well-known problem when training neural networks in the continual learning setting. Research on the problem has mainly focused on training a single network. In our work, we explore a dual network approach. We propose a brain-inspired complementary dual network model for continual learning that comprises a fast learner and a slow consolidator. The fast learner first adapts to a new task seen only once, and the slow consolidator then distills the new task information from the fast learner via knowledge distillation. The two networks are trained in an alternating manner. To consolidate the learning of a new task with that of past tasks, we employ a small memory of each task for replay during the training of the slow consolidator. In addition, we incorporate a context-based gating mechanism in the slow consolidator and empirically demonstrate its positive impact on the performance of the proposed model. We show improved results of the proposed model on several classification datasets.
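The abstract outlines an alternating training loop: the fast learner fits each new task in a single pass, and the slow consolidator is then updated with a distillation loss against the fast learner's outputs plus a replay loss on stored samples of past tasks. The sketch below illustrates one way such a loop could look in PyTorch. It is a minimal illustration based only on the abstract, not the authors' implementation: the function names (train_task, replay_batch), the single shared memory buffer, the loss weight alpha, and the temperature T are all assumptions, and the paper's per-task memories and the context-based gating inside the slow consolidator are only noted in comments.

import random

import torch
import torch.nn.functional as F


def distillation_loss(student_logits, teacher_logits, T=2.0):
    # Soft-target knowledge distillation (Hinton et al., 2015): KL
    # divergence between temperature-softened output distributions.
    p_teacher = F.softmax(teacher_logits / T, dim=1)
    log_p_student = F.log_softmax(student_logits / T, dim=1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * (T * T)


def replay_batch(memory, batch_size=32):
    # Sample a rehearsal batch of stored (input, label) pairs.
    pairs = random.sample(memory, min(batch_size, len(memory)))
    xs, ys = zip(*pairs)
    return torch.stack(xs), torch.stack(ys)


def train_task(fast, slow, task_loader, memory, opt_fast, opt_slow,
               alpha=1.0, mem_per_task=50):
    # Hypothetical alternating schedule for one task, inferred from the
    # abstract: fast learner first, then the slow consolidator.
    stored = 0
    for x, y in task_loader:  # each example is seen only once
        # (1) The fast learner adapts directly to the new task.
        opt_fast.zero_grad()
        F.cross_entropy(fast(x), y).backward()
        opt_fast.step()

        # (2) The slow consolidator distills the fast learner's outputs
        #     while replaying past-task samples; in the paper, a
        #     context-based gate (task-conditioned masking of units)
        #     would additionally be applied inside `slow`.
        opt_slow.zero_grad()
        with torch.no_grad():
            teacher_logits = fast(x)
        loss = distillation_loss(slow(x), teacher_logits)
        if memory:
            xr, yr = replay_batch(memory)
            loss = loss + alpha * F.cross_entropy(slow(xr), yr)
        loss.backward()
        opt_slow.step()

        # Keep a few examples of the current task for future replay
        # (the paper keeps a small memory per task; a single shared
        # buffer is used here for brevity).
        for xi, yi in zip(x, y):
            if stored < mem_per_task:
                memory.append((xi, yi))
                stored += 1

Calling train_task once per task in sequence, with a persistent memory list and two optimizers, would reproduce the overall fast/slow training pattern the abstract describes.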
Pages: 112-117
Page count: 6