Continual learning of context-dependent processing in neural networks

Times cited: 164
Authors:
Zeng, Guanxiong [1 ,2 ,3 ]
Chen, Yang [1 ,2 ]
Cui, Bo [1 ,2 ,3 ]
Yu, Shan [1 ,2 ,3 ,4 ]
Affiliations:
[1] Chinese Acad Sci, Inst Automat, Brainnetome Ctr, Beijing, Peoples R China
[2] Chinese Acad Sci, Inst Automat, Natl Lab Pattern Recognit, Beijing, Peoples R China
[3] Univ Chinese Acad Sci, Beijing, Peoples R China
[4] Chinese Acad Sci, Ctr Excellence Brain Sci & Intelligence Technol, Beijing, Peoples R China
Keywords:
PREFRONTAL CORTEX; CONNECTIONIST MODELS; MIXED SELECTIVITY; SYSTEMS
DOI: 10.1038/s42256-019-0080-x
Chinese Library Classification: TP18 [Artificial intelligence theory]
Subject classification codes: 081104; 0812; 0835; 1405
Abstract:
Deep neural networks are powerful tools for learning sophisticated but fixed mapping rules between inputs and outputs, which limits their application in more complex and dynamic situations where the mapping rules are not fixed but change according to context. To lift this limitation, we developed an approach involving a learning algorithm, called orthogonal weights modification (OWM), together with a context-dependent processing module. We demonstrated that, with OWM to overcome catastrophic forgetting and the context-dependent processing module to learn how to reuse a feature representation and a classifier across contexts, a single network could acquire numerous context-dependent mapping rules in an online and continual manner, with as few as roughly ten samples to learn each. Our approach should enable highly compact systems to gradually learn the myriad regularities of the real world and eventually behave appropriately within it.

When neural networks are retrained to solve more than one problem, they tend to forget what they learned earlier. Here, the authors propose orthogonal weights modification, a method that avoids this so-called catastrophic forgetting. Capitalizing on this ability, a new module is introduced that enables the network to continually learn context-dependent processing.
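The abstract outlines two components: the OWM learning rule, which protects previously learned mappings, and a context-dependent processing module that reuses one feature extractor and classifier across contexts. The following is a minimal NumPy sketch of an OWM-style projected update for a single linear layer, assuming a recursive-least-squares-style projector; names such as OWMLinear, alpha, lr and consolidate are illustrative choices, not the authors' reference implementation, and the paper's exact recursive formulation (e.g. per-batch mean inputs, per-layer schedules) may differ.

```python
# Hedged sketch of orthogonal weights modification (OWM) for one linear layer:
# gradient updates are projected onto the subspace orthogonal to inputs from
# previously learned tasks, so earlier input->output mappings are preserved.
import numpy as np

class OWMLinear:
    def __init__(self, in_dim, out_dim, alpha=1e-3):
        self.W = 0.01 * np.random.randn(out_dim, in_dim)
        self.alpha = alpha
        # P starts as the identity and is shrunk toward the orthogonal
        # complement of all consolidated (previously seen) inputs.
        self.P = np.eye(in_dim)

    def forward(self, x):
        return self.W @ x

    def backward_step(self, x, grad_out, lr=0.05):
        g = np.outer(grad_out, x)   # ordinary dL/dW for a linear layer
        self.W -= lr * g @ self.P   # project the update away from old inputs

    def consolidate(self, x):
        # Recursive-least-squares-style shrinkage: after this call, P @ x is
        # approximately zero, so future updates leave the layer's response
        # to x (and directions spanned by earlier consolidated inputs) intact.
        Px = self.P @ x
        self.P -= np.outer(Px, Px) / (self.alpha + x @ Px)

# Toy usage: learn two input->target pairs in sequence without forgetting.
layer = OWMLinear(in_dim=8, out_dim=2)
rng = np.random.default_rng(0)
for _task in range(2):
    x, target = rng.standard_normal(8), rng.standard_normal(2)
    for _ in range(100):
        layer.backward_step(x, layer.forward(x) - target)  # squared-error grad
    layer.consolidate(x)  # protect this task's input direction
```

After each task, consolidate folds that task's input directions into P, so later gradient steps are projected away from them. The context-dependent processing module described in the abstract, which modulates the features with a context signal before classification, would sit upstream of such a layer and is not sketched here.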
Pages: 364-372 (9 pages)
Related papers (50 in total):
  • [31] Neural Correlates of Context-Dependent Feature Conjunction Learning in Visual Search Tasks
    Reavis, Eric A.
    Frank, Sebastian M.
    Greenlee, Mark W.
    Tse, Peter U.
    HUMAN BRAIN MAPPING, 2016, 37 (06) : 2319 - 2330
  • [32] Neural Evolution of Context-Dependent Fly Song
    Ding, Yun
    Lillvis, Joshua L.
    Cande, Jessica
    Berman, Gordon J.
    Arthur, Benjamin J.
    Long, Xi
    Xu, Min
    Dickson, Barry J.
    Stern, David L.
    CURRENT BIOLOGY, 2019, 29 (07) : 1089+
  • [33] Context-Dependent Neural Modulations in the Perception of Duration
    Murai, Yuki
    Yotsumoto, Yuko
    FRONTIERS IN INTEGRATIVE NEUROSCIENCE, 2016, 10
  • [34] Context-Dependent Expectations Influence Neural Processing of Observed Goal-Directed Action
    Ondobaka, Sasha
    Wittmann, Marco
    de Lange, Floris
    Bekkering, Harold
    JOURNAL OF COGNITIVE NEUROSCIENCE, 2013 : 262
  • [35] VGG16-Based Diffractive Optical Neural Network and Context-Dependent Processing
    Zhao, Xingya
    Yang, Zhiwei
    Dai, Jian
    Zhang, Tian
    Xu, Kun
    ACTA OPTICA SINICA, 2022, 42 (19)
  • [36] Regularization of Context-Dependent Deep Neural Networks with Context-Independent Multi-Task Training
    Bell, Peter
    Renals, Steve
    2015 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING (ICASSP), 2015 : 4290 - 4294
  • [37] Continual robot learning with constructive neural networks
    Grossmann, A.
    Poli, R.
    LEARNING ROBOTS, PROCEEDINGS, 1998, 1545 : 95 - 108
  • [38] Continual Learning Using Bayesian Neural Networks
    Li, Honglin
    Barnaghi, Payam
    Enshaeifar, Shirin
    Ganz, Frieder
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2021, 32 (09) : 4243 - 4252
  • [40] Continual Learning with Sparse Progressive Neural Networks
    Ergun, Esra
    Toreyin, Behcet Ugur
    2020 28TH SIGNAL PROCESSING AND COMMUNICATIONS APPLICATIONS CONFERENCE (SIU), 2020