Neural AMR: Sequence-to-Sequence Models for Parsing and Generation

Cited by: 113
Authors
Konstas, Ioannis [1 ]
Iyer, Srinivasan [1 ]
Yatskar, Mark [1 ]
Choi, Yejin [1 ]
Zettlemoyer, Luke [1 ,2 ]
Affiliations
[1] Univ Washington, Paul G Allen Sch Comp Sci & Engn, Seattle, WA 98195 USA
[2] Allen Inst Artificial Intelligence, Seattle, WA USA
Source
PROCEEDINGS OF THE 55TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2017), VOL 1 | 2017
Funding
U.S. National Science Foundation;
Keywords
DOI
10.18653/v1/P17-1014
CLC number
TP39 [Computer applications];
Discipline codes
081203; 0835;
Abstract
Sequence-to-sequence models have shown strong performance across a broad range of applications. However, their application to parsing and generating text using Abstract Meaning Representation (AMR) has been limited, due to the relatively limited amount of labeled data and the non-sequential nature of the AMR graphs. We present a novel training procedure that can lift this limitation using millions of unlabeled sentences and careful preprocessing of the AMR graphs. For AMR parsing, our model achieves competitive results of 62.1 SMATCH, the current best score reported without significant use of external semantic resources. For AMR generation, our model establishes a new state-of-the-art performance of BLEU 33.8. We present extensive ablative and qualitative analysis including strong evidence that sequence-based AMR models are robust against ordering variations of graph-to-sequence conversions.
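To make the "careful preprocessing of the AMR graphs" concrete: the approach feeds a depth-first linearization of each AMR graph to a standard sequence-to-sequence model. The Python sketch below illustrates one such graph-to-sequence conversion; the graph encoding and the `linearize` helper are hypothetical conveniences, and the paper's actual pipeline additionally anonymizes named entities and simplifies the linearized form in ways not shown here.

# Minimal sketch (hypothetical) of a graph-to-sequence conversion of the
# kind seq2seq AMR models consume; the paper's real preprocessing also
# anonymizes entities and further simplifies the sequence.

def linearize(node, graph, visited=None):
    """Depth-first linearization of an AMR graph into a token list.

    `graph` maps each variable to (concept, [(role, child_variable), ...]).
    A re-entrant node (already visited) is emitted as its bare concept,
    so the reentrant graph becomes a tree-shaped token sequence.
    """
    if visited is None:
        visited = set()
    concept, edges = graph[node]
    if node in visited or not edges:
        return [concept]
    visited.add(node)
    tokens = ["(", concept]
    for role, child in edges:
        tokens.append(role)
        tokens.extend(linearize(child, graph, visited))
    tokens.append(")")
    return tokens

# AMR for "The boy wants to go"; variable b is re-entrant (two incoming edges).
amr = {
    "w": ("want-01", [(":ARG0", "b"), (":ARG1", "g")]),
    "b": ("boy", []),
    "g": ("go-01", [(":ARG0", "b")]),
}
print(" ".join(linearize("w", amr)))
# -> ( want-01 :ARG0 boy :ARG1 ( go-01 :ARG0 boy ) )

Permuting the edge lists in `amr` produces different but semantically equivalent token sequences, which is the kind of ordering variation against which the abstract reports sequence-based AMR models are robust.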
Pages: 146-157
Page count: 12