Transformers discover an elementary calculation system exploiting local attention and grid-like problem representation

Cited by: 2
Authors
Cognolato, Samuel [1]
Testolin, Alberto [1]
Affiliations
[1] Univ Padua, Padua, Italy
Keywords
numerical reasoning; symbolic addition; procedural learning; extrapolation; universal transformers; external memory
DOI
10.1109/IJCNN55064.2022.9892619
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
Mathematical reasoning is one of the most impressive achievements of human intellect but remains a formidable challenge for artificial intelligence systems. In this work we explore whether modern deep learning architectures can learn to solve a symbolic addition task by discovering effective arithmetic procedures. Although the problem might seem trivial at first glance, generalizing arithmetic knowledge to operations involving a higher number of terms, possibly composed of longer sequences of digits, has proven extremely challenging for neural networks. Here we show that universal transformers equipped with local attention and adaptive halting mechanisms can learn to exploit an external, grid-like memory to carry out multi-digit addition. The proposed model achieves remarkable accuracy even when tested with problems requiring extrapolation outside the training distribution; most notably, it does so by discovering human-like calculation strategies such as place value alignment.
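
The abstract describes encoding addition problems in a grid-like external memory and processing them with local attention and adaptive halting, ultimately discovering a place-value-aligned, column-by-column strategy. The sketch below is a minimal illustrative analogue of that strategy, not the authors' neural model: it shows an assumed grid encoding (one operand per row, digits right-aligned by place value) and a column-wise procedure that only ever looks at one digit column plus the running carry. The function names `encode_grid` and `columnwise_addition` are hypothetical.

```python
import numpy as np

# Hypothetical sketch (not the paper's code): encode a multi-term addition
# problem as a grid where each row is one operand and each column is one
# place-value position, right-aligned so that units, tens, etc. line up.
def encode_grid(operands, width):
    grid = np.zeros((len(operands), width), dtype=int)
    for r, n in enumerate(operands):
        digits = [int(d) for d in str(n)]
        assert len(digits) <= width, "grid too narrow for this operand"
        grid[r, width - len(digits):] = digits  # right-align = place value alignment
    return grid

# Column-local processing: at each step only one digit column and the carry
# are needed, mirroring the human addition procedure the model is said to
# rediscover; the trailing carry loop plays the role of adaptive halting
# (keep producing output digits until nothing is left to emit).
def columnwise_addition(grid):
    carry = 0
    out_digits = []
    for col in range(grid.shape[1] - 1, -1, -1):  # rightmost column first
        s = grid[:, col].sum() + carry
        out_digits.append(s % 10)
        carry = s // 10
    while carry:
        out_digits.append(carry % 10)
        carry //= 10
    return int("".join(str(d) for d in reversed(out_digits)))

operands = [4096, 517, 88]
grid = encode_grid(operands, width=5)
print(grid)
print(columnwise_addition(grid), "==", sum(operands))  # 4701 == 4701
```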
Pages: 8