Neural AMR: Sequence-to-Sequence Models for Parsing and Generation

Cited by: 111
Authors
Konstas, Ioannis [1 ]
Iyer, Srinivasan [1 ]
Yatskar, Mark [1 ]
Choi, Yejin [1 ]
Zettlemoyer, Luke [1 ,2 ]
Affiliations
[1] Univ Washington, Paul G Allen Sch Comp Sci & Engn, Seattle, WA 98195 USA
[2] Allen Inst Artificial Intelligence, Seattle, WA USA
Funding
U.S. National Science Foundation;
Keywords
DOI
10.18653/v1/P17-1014
Chinese Library Classification (CLC)
TP39 [Computer Applications];
Discipline Classification Code
081203 ; 0835 ;
Abstract
Sequence-to-sequence models have shown strong performance across a broad range of applications. However, their application to parsing and generating text using Abstract Meaning Representation (AMR) has been limited, due to the relatively limited amount of labeled data and the non-sequential nature of the AMR graphs. We present a novel training procedure that can lift this limitation using millions of unlabeled sentences and careful preprocessing of the AMR graphs. For AMR parsing, our model achieves competitive results of 62.1 SMATCH, the current best score reported without significant use of external semantic resources. For AMR generation, our model establishes a new state-of-the-art performance of BLEU 33.8. We present extensive ablative and qualitative analysis including strong evidence that sequence-based AMR models are robust against ordering variations of graph-to-sequence conversions.
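The abstract refers to "careful preprocessing of the AMR graphs", i.e. converting each graph into a flat token sequence that a sequence-to-sequence model can consume, and to the ordering variations such graph-to-sequence conversions introduce. As a rough illustration only (this is not the authors' exact pipeline; the graph encoding, the linearize helper, and the re-entrancy convention below are assumptions made for the sketch), a depth-first traversal of a PENMAN-style AMR might look like this in Python:

    # Minimal sketch of AMR graph-to-sequence linearization (hypothetical,
    # not the paper's exact preprocessing): depth-first traversal of a
    # PENMAN-style AMR graph into a flat token sequence for a seq2seq model.

    # A tiny AMR for "The boy wants to go", keyed by variable name.
    # Each node: (concept, [(relation, child_variable_or_constant), ...])
    amr = {
        "w": ("want-01", [(":ARG0", "b"), (":ARG1", "g")]),
        "b": ("boy", []),
        "g": ("go-01", [(":ARG0", "b")]),
    }

    def linearize(var, graph, visited=None):
        """Depth-first linearization; a re-entrant node is emitted as a bare
        variable token so the sequence stays finite (one simple convention,
        assumed here for illustration)."""
        if visited is None:
            visited = set()
        if var in visited:                  # re-entrancy: refer back by variable
            return [var]
        visited.add(var)
        concept, edges = graph[var]
        tokens = ["(", concept]
        for rel, child in edges:
            tokens.append(rel)
            if child in graph:
                tokens.extend(linearize(child, graph, visited))
            else:
                tokens.append(str(child))   # constants / literals
        tokens.append(")")
        return tokens

    if __name__ == "__main__":
        print(" ".join(linearize("w", amr)))
        # ( want-01 :ARG0 ( boy ) :ARG1 ( go-01 :ARG0 b ) )

Changing the order in which the children of a node are visited yields a different but equally valid token sequence for the same graph; the abstract's robustness claim concerns exactly such ordering variations.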
Pages: 146 - 157
Page count: 12
Related Papers
50 records in total
  • [41] Learning Neural Sequence-to-Sequence Models from Weak Feedback with Bipolar Ramp Loss
    Jehl, Laura
Lawrence, Carolin
    Riezler, Stefan
    TRANSACTIONS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, 2019, 7 : 233 - 248
  • [42] Named Entity Transliteration with Sequence-to-Sequence Neural Network
    Li, Zhongwei
    Chng, Eng Siong
    Li, Haizhou
    2017 INTERNATIONAL CONFERENCE ON ASIAN LANGUAGE PROCESSING (IALP), 2017, : 374 - 378
  • [43] Automated Integration of Genomic Metadata with Sequence-to-Sequence Models
    Cannizzaro, Giuseppe
    Leone, Michele
    Bernasconi, Anna
    Canakoglu, Arif
    Carman, Mark J.
    MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES: APPLIED DATA SCIENCE AND DEMO TRACK, ECML PKDD 2020, PT V, 2021, 12461 : 187 - 203
  • [44] Bandit Structured Prediction for Neural Sequence-to-Sequence Learning
    Kreutzer, Julia
    Sokolov, Artem
    Riezler, Stefan
    PROCEEDINGS OF THE 55TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2017), VOL 1, 2017, : 1503 - 1513
  • [45] A Fuzzy Training Framework for Controllable Sequence-to-Sequence Generation
    Li, Jiajia
    Wang, Ping
    Li, Zuchao
    Liu, Xi
    Utiyama, Masao
    Sumita, Eiichiro
    Zhao, Hai
    Ai, Haojun
    IEEE ACCESS, 2022, 10 : 92467 - 92480
  • [46] Enhancing Sequence-to-Sequence Neural Lemmatization with External Resources
    Milintsevich, Kirill
    Sirts, Kairit
    16TH CONFERENCE OF THE EUROPEAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (EACL 2021), 2021, : 3112 - 3122
  • [47] Encoding Emotional Information for Sequence-to-Sequence Response Generation
    Chan, Yin Hei
    Lui, Andrew Kwok Fai
    2018 INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND BIG DATA (ICAIBD), 2018, : 113 - 116
  • [48] Myanmar News Headline Generation with Sequence-to-Sequence model
    Thu, Yamin
    Pa, Win Pa
    PROCEEDINGS OF 2020 23RD CONFERENCE OF THE ORIENTAL COCOSDA INTERNATIONAL COMMITTEE FOR THE CO-ORDINATION AND STANDARDISATION OF SPEECH DATABASES AND ASSESSMENT TECHNIQUES (ORIENTAL-COCOSDA 2020), 2020, : 117 - 122
  • [49] AMR Parsing via Graph⇆Sequence Iterative Inference
    Cai, Deng
    Lam, Wai
    58TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2020), 2020, : 1290 - 1301
  • [50] Effect of Data Reduction on Sequence-to-Sequence Neural TTS
    Latorre, Javier
    Lachowicz, Jakub
    Lorenzo-Trueba, Jaime
    Merritt, Thomas
    Drugman, Thomas
    Ronanki, Srikanth
    Klimkov, Viacheslav
    2019 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2019, : 7075 - 7079