Generative Continual Concept Learning

Citations: 0
Authors
Rostami, Mohammad [1 ]
Kolouri, Soheil [2 ]
McClelland, James [3 ]
Pilly, Praveen [2 ]
Affiliations
[1] Univ Penn, Philadelphia, PA 19104 USA
[2] HRL Labs LLC, Malibu, CA USA
[3] Stanford Univ, Stanford, CA 94305 USA
Source
Thirty-Fourth AAAI Conference on Artificial Intelligence, the Thirty-Second Innovative Applications of Artificial Intelligence Conference and the Tenth AAAI Symposium on Educational Advances in Artificial Intelligence | 2020 / Vol. 34
Keywords
DOI
None available
Chinese Library Classification
TP18 [Artificial intelligence theory];
Discipline classification code
081104; 0812; 0835; 1405;
Abstract
After learning a concept, humans can continually generalize it to new domains from only a few labeled instances, without interfering with previously learned knowledge. In contrast, learning concepts efficiently in a continual learning setting remains an open challenge for current artificial intelligence algorithms, as persistent model retraining is necessary. Inspired by the Parallel Distributed Processing and Complementary Learning Systems theories, we develop a computational model that can efficiently expand its previously learned concepts to new domains using a few labeled samples. We couple the new form of each concept to its previously learned forms in an embedding space for effective continual learning. In doing so, a generative distribution is learned that is shared across tasks in the embedding space and models the abstract concepts. This procedure enables the model to generate pseudo-data points that replay past experience to tackle catastrophic forgetting.
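The replay mechanism the abstract describes can be sketched as follows. This is a minimal, hypothetical illustration, not the paper's implementation: it stands in for the learned generative distribution with per-concept Gaussians fitted in a shared embedding space, then mixes pseudo-data sampled from that memory with a few labeled samples from a new domain. All names (`GaussianConceptMemory`, the toy embeddings) are assumptions for this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)


class GaussianConceptMemory:
    """Hypothetical stand-in for the shared generative distribution:
    one Gaussian per concept, fitted in the embedding space."""

    def __init__(self):
        self.stats = {}  # concept label -> (mean, covariance)

    def update(self, z, y):
        # Fit a Gaussian to the embedded samples of each concept label.
        for label in np.unique(y):
            pts = z[y == label]
            self.stats[label] = (pts.mean(axis=0), np.cov(pts, rowvar=False))

    def replay(self, n_per_concept):
        # Generate pseudo-data points for every previously seen concept.
        zs, ys = [], []
        for label, (mu, cov) in self.stats.items():
            zs.append(rng.multivariate_normal(mu, cov, size=n_per_concept))
            ys.append(np.full(n_per_concept, label))
        return np.vstack(zs), np.concatenate(ys)


# Task 1: two concepts observed in a 2-D embedding space (toy data).
z1 = np.vstack([rng.normal(0.0, 0.3, (50, 2)), rng.normal(3.0, 0.3, (50, 2))])
y1 = np.array([0] * 50 + [1] * 50)
memory = GaussianConceptMemory()
memory.update(z1, y1)

# Task 2: a few labeled samples of concept 0 in a new domain; mixing them
# with replayed pseudo-data counters catastrophic forgetting of concept 1.
z2 = rng.normal(1.5, 0.3, (5, 2))
y2 = np.full(5, 0)
z_replay, y_replay = memory.replay(50)
z_train = np.vstack([z2, z_replay])
y_train = np.concatenate([y2, y_replay])
```

In the paper the generative model is learned jointly with the task network; the fixed Gaussians here only illustrate the replay step itself.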
Pages: 5545-5552
Page count: 8