Compositional generalization through meta sequence-to-sequence learning

Cited: 0
Authors
Lake, Brenden M. [1]
Affiliations
[1] NYU, Facebook AI Research, New York, NY 10003 USA
Source
ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019) | 2019 / Vol. 32
Keywords
DOI
Not available
Chinese Library Classification
TP18 [Theory of Artificial Intelligence]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
People can learn a new concept and use it compositionally, understanding how to "blicket twice" after learning how to "blicket." In contrast, powerful sequence-to-sequence (seq2seq) neural networks fail such tests of compositionality, especially when composing new concepts together with existing concepts. In this paper, I show how memory-augmented neural networks can be trained to generalize compositionally through meta seq2seq learning. In this approach, models train on a series of seq2seq problems to acquire the compositional skills needed to solve new seq2seq problems. Meta seq2seq learning solves several of the SCAN tests for compositional learning and can learn to apply implicit rules to variables.
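The abstract describes training on a series of seq2seq problems so that primitive meanings must be inferred anew each time. A minimal sketch of how such meta-learning episodes could be constructed is below; the nonsense words, action symbols, and the single "twice" composition rule are illustrative assumptions in the spirit of SCAN, not the paper's actual episode generator.

```python
import random

def make_episode(rng, primitives=("dax", "wif", "lug"),
                 actions=("RED", "GREEN", "BLUE")):
    """Sample one meta seq2seq episode (illustrative sketch).

    Each episode re-samples the assignment of primitive words to output
    actions, so a model must read the support set to infer meanings,
    then apply the shared composition rule 'X twice -> X X' on queries.
    """
    # Episode-specific, randomly permuted word -> action mapping.
    mapping = dict(zip(primitives, rng.sample(actions, len(primitives))))
    # Support set: demonstrates each primitive's meaning in isolation.
    support = [((word,), (mapping[word],)) for word in primitives]
    # Query set: tests composing each primitive with the 'twice' modifier.
    query = [((word, "twice"), (mapping[word], mapping[word]))
             for word in primitives]
    return support, query

rng = random.Random(0)
support, query = make_episode(rng)
```

Because the mapping changes every episode, memorizing word meanings across episodes is useless; only the compositional rule transfers, which is the skill meta seq2seq learning is meant to acquire.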
Pages: 11