Neural AMR: Sequence-to-Sequence Models for Parsing and Generation

Cited by: 111
Authors
Konstas, Ioannis [1]
Iyer, Srinivasan [1]
Yatskar, Mark [1]
Choi, Yejin [1]
Zettlemoyer, Luke [1,2]
Affiliations
[1] Univ Washington, Paul G Allen Sch Comp Sci & Engn, Seattle, WA 98195 USA
[2] Allen Inst Artificial Intelligence, Seattle, WA USA
Funding
U.S. National Science Foundation
DOI
10.18653/v1/P17-1014
Chinese Library Classification
TP39 [Computer Applications]
Subject Classification Codes
081203; 0835
Abstract
Sequence-to-sequence models have shown strong performance across a broad range of applications. However, their application to parsing and generating text using Abstract Meaning Representation (AMR) has been limited, due to the relatively limited amount of labeled data and the non-sequential nature of the AMR graphs. We present a novel training procedure that can lift this limitation using millions of unlabeled sentences and careful preprocessing of the AMR graphs. For AMR parsing, our model achieves competitive results of 62.1 SMATCH, the current best score reported without significant use of external semantic resources. For AMR generation, our model establishes a new state-of-the-art performance of BLEU 33.8. We present extensive ablative and qualitative analysis including strong evidence that sequence-based AMR models are robust against ordering variations of graph-to-sequence conversions.
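The "careful preprocessing of the AMR graphs" mentioned in the abstract refers to converting each graph into a flat token sequence that a sequence-to-sequence model can consume. Below is a minimal sketch of one such graph-to-sequence linearization, assuming PENMAN-notation input; the function name `linearize_amr` and the example graph are illustrative assumptions, not the authors' released pipeline (which also anonymizes named entities and applies further cleanup).

```python
# Minimal sketch of AMR graph-to-sequence linearization (illustrative only).
# It flattens a PENMAN-style AMR string into tokens, dropping variable names
# (e.g. "w /") so the model sees only concepts, relations, and parentheses.
import re

def linearize_amr(amr: str) -> str:
    """Linearize a PENMAN-notation AMR string into a space-separated
    token sequence with variable names removed."""
    # Put whitespace around parentheses so they become standalone tokens.
    amr = re.sub(r"([()])", r" \1 ", amr)
    tokens = []
    for tok in amr.split():
        if tok == "/":
            # Drop the variable name that preceded the '/' separator;
            # the concept after it is kept by the normal branch below.
            if tokens:
                tokens.pop()
        else:
            tokens.append(tok)
    return " ".join(tokens)

if __name__ == "__main__":
    graph = "(w / want-01 :ARG0 (b / boy) :ARG1 (g / go-01 :ARG0 b))"
    print(linearize_amr(graph))
    # -> ( want-01 :ARG0 ( boy ) :ARG1 ( go-01 :ARG0 b ) )
```

Note that a re-entrant variable (the second `b` above) survives as a bare token; handling such re-entrancies, and mapping the model's output sequence back to a graph, is part of what makes this preprocessing nontrivial in practice.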
Pages: 146-157
Page count: 12
Related Papers (50 in total)
  • [21] AMR Parsing as Sequence-to-Graph Transduction
    Zhang, Sheng
    Ma, Xutai
    Duh, Kevin
    Van Durme, Benjamin
    57TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2019), 2019, : 80 - 94
  • [22] Turkish Data-to-Text Generation Using Sequence-to-Sequence Neural Networks
    Demir, Seniz
    ACM TRANSACTIONS ON ASIAN AND LOW-RESOURCE LANGUAGE INFORMATION PROCESSING, 2023, 22 (02)
  • [23] PhraseTransformer: an incorporation of local context information into sequence-to-sequence semantic parsing
    Nguyen, Phuong Minh
    Le, Tung
    Nguyen, Huy Tien
    Tran, Vu
    Nguyen, Minh Le
    APPLIED INTELLIGENCE, 2023, 53 (12) : 15889 - 15908
  • [25] Automatic Target Generation for Electronic Data Interchange using Sequence-to-Sequence Models
    Baysan, Mehmet Selman
    Kizilay, Furkan
    Gundogan, Haluk Harun
    Ozmen, Ayse Irem
    Ince, Gokhan
    INTELLIGENT AND FUZZY SYSTEMS, INFUS 2024 CONFERENCE, VOL 1, 2024, 1088 : 158 - 166
  • [26] Enriched In-Order Linearization for Faster Sequence-to-Sequence Constituent Parsing
    Fernandez-Gonzalez, Daniel
    Gomez-Rodriguez, Carlos
    58TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2020), 2020, : 4092 - 4099
  • [27] FiD-Ex: Improving Sequence-to-Sequence Models for Extractive Rationale Generation
    Lakhotia, Kushal
    Paranjape, Bhargavi
    Ghoshal, Asish
    Yih, Wen-tau
    Mehdad, Yashar
    Iyer, Srinivasan
    2021 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2021), 2021, : 3712 - 3727
  • [28] Incremental Text to Speech for Neural Sequence-to-Sequence Models using Reinforcement Learning
    Mohan, Devang S. Ram
    Lenain, Raphael
    Foglianti, Lorenzo
    Teh, Tian Huey
    Staib, Marlene
    Torresquintero, Alexandra
    Gao, Jiameng
    INTERSPEECH 2020, 2020, : 3186 - 3190
  • [29] Controlling Sequence-to-Sequence Models - A Demonstration on Neural-based Acrostic Generator
    Shen, Liang-Hsin
    Tai, Pei-Lun
    Wu, Chao-Chung
    Lin, Shou-De
    2019 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING AND THE 9TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (EMNLP-IJCNLP 2019): PROCEEDINGS OF SYSTEM DEMONSTRATIONS, 2019, : 43 - 48
  • [30] Sequence-to-Sequence Learning with Latent Neural Grammars
    Kim, Yoon
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34