Deep Multitask Learning for Semantic Dependency Parsing

Cited by: 36
Authors:
Peng, Hao [1]
Thomson, Sam [2]
Smith, Noah A. [1]
Affiliations:
[1] Univ Washington, Paul G Allen Sch Comp Sci & Engn, Seattle, WA 98195 USA
[2] Carnegie Mellon Univ, Sch Comp Sci, Pittsburgh, PA 15213 USA
Source:
Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (ACL 2017), Volume 1, 2017
DOI: 10.18653/v1/P17-1186
CLC Classification: TP39 [Computer Applications]
Discipline Codes: 081203; 0835
Abstract
We present a deep neural architecture that parses sentences into three semantic dependency graph formalisms. By using efficient, nearly arc-factored inference and a bidirectional-LSTM composed with a multi-layer perceptron, our base system is able to significantly improve the state of the art for semantic dependency parsing, without using hand-engineered features or syntax. We then explore two multitask learning approaches: one that shares parameters across formalisms, and one that uses higher-order structures to predict the graphs jointly. We find that both approaches improve performance across formalisms on average, achieving a new state of the art. Our code is open-source and available at https://github.com/Noahs-ARK/NeurboParser.
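To make the architecture described in the abstract concrete, below is a minimal sketch of the parameter-sharing idea: a BiLSTM encoder shared across formalisms, composed with one small MLP per formalism that scores candidate head-modifier arcs. This is an illustrative assumption-laden sketch, not the authors' implementation (their code is at https://github.com/Noahs-ARK/NeurboParser); the class name SharedBiLSTMArcScorer, the formalism tags, and all dimensions are hypothetical, and inference over the resulting arc scores is omitted.

import torch
import torch.nn as nn

class SharedBiLSTMArcScorer(nn.Module):
    """Sketch of multitask arc scoring: one shared BiLSTM encoder,
    one arc-scoring MLP per formalism (names and sizes are hypothetical)."""
    def __init__(self, vocab_size, formalisms=("dm", "pas", "psd"),
                 emb_dim=100, hidden_dim=200, mlp_dim=100):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Encoder parameters are shared by all formalisms.
        self.bilstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True,
                              bidirectional=True)
        # Each formalism gets its own MLP over (head, modifier) pairs.
        self.arc_mlps = nn.ModuleDict({
            f: nn.Sequential(
                nn.Linear(4 * hidden_dim, mlp_dim),
                nn.Tanh(),
                nn.Linear(mlp_dim, 1),
            )
            for f in formalisms
        })

    def forward(self, token_ids, formalism):
        # token_ids: (batch, seq_len) integer word ids
        h, _ = self.bilstm(self.embed(token_ids))        # (B, T, 2*hidden)
        B, T, D = h.shape
        heads = h.unsqueeze(2).expand(B, T, T, D)        # candidate head i
        mods = h.unsqueeze(1).expand(B, T, T, D)         # candidate modifier j
        pair = torch.cat([heads, mods], dim=-1)          # (B, T, T, 4*hidden)
        return self.arc_mlps[formalism](pair).squeeze(-1)  # arc scores (B, T, T)

# Toy usage: score all head-modifier arcs for one formalism.
if __name__ == "__main__":
    model = SharedBiLSTMArcScorer(vocab_size=1000)
    tokens = torch.randint(0, 1000, (2, 7))
    scores = model(tokens, "dm")
    print(scores.shape)  # torch.Size([2, 7, 7])

The sketch only illustrates the shared-encoder variant; the paper's second approach, which uses higher-order structures to predict the graphs jointly, would additionally require cross-formalism factors at inference time.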
Pages: 2037-2048 (12 pages)