Continual Learning with Deep Generative Replay

Cited by: 0
Authors
Shin, Hanul [1 ,2 ]
Lee, Jung Kwon [2 ]
Kim, Jaehong [2 ]
Kim, Jiwon [2 ]
Affiliations
[1] MIT, Cambridge, MA 02139 USA
[2] SK T-Brain, Seoul, South Korea
Source
ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 30 (NIPS 2017) | 2017 / Vol. 30
Keywords
MEMORY; STABILITY;
DOI
Not available
CLC classification number
TP18 [Theory of Artificial Intelligence];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Attempts to train a comprehensive artificial intelligence capable of solving multiple tasks have been impeded by a chronic problem called catastrophic forgetting. Although simply replaying all previous data alleviates the problem, it requires large memory and, even worse, is often infeasible in real-world applications where access to past data is limited. Inspired by the generative nature of the hippocampus as a short-term memory system in the primate brain, we propose Deep Generative Replay, a novel framework with a cooperative dual-model architecture consisting of a deep generative model ("generator") and a task-solving model ("solver"). With only these two models, training data for previous tasks can easily be sampled and interleaved with data for a new task. We test our methods in several sequential learning settings involving image classification tasks.
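The core mechanism the abstract describes, mixing pseudo-samples drawn from a generator and labeled by the previous solver with real data for the new task, can be sketched as follows. This is a minimal illustration, not the authors' implementation: `make_replay_batch`, `generator`, and `old_solver` are hypothetical names standing in for a trained generative model and the solver from the previous task.

```python
import random

def make_replay_batch(new_data, generator, old_solver, batch_size, replay_ratio):
    """Build one training batch that interleaves new-task samples with
    generative replay of past tasks.

    new_data:     list of (input, label) pairs for the current task
    generator:    callable producing a pseudo-input resembling past-task data
    old_solver:   callable labeling a pseudo-input (the previous solver)
    replay_ratio: fraction of the batch drawn from generative replay
    """
    n_replay = int(batch_size * replay_ratio)
    n_new = batch_size - n_replay

    # Real samples from the current task.
    batch = random.sample(new_data, min(n_new, len(new_data)))

    # Replayed samples: the generator produces pseudo-inputs,
    # and the old solver supplies their targets.
    for _ in range(n_replay):
        x = generator()
        y = old_solver(x)
        batch.append((x, y))

    random.shuffle(batch)
    return batch
```

In the full framework, both the generator and the solver are retrained on such mixed batches after each task, so knowledge of all earlier tasks is carried forward without storing any past data.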
Pages: 10