Neural AMR: Sequence-to-Sequence Models for Parsing and Generation

Cited by: 111
Authors:
Konstas, Ioannis [1 ]
Iyer, Srinivasan [1 ]
Yatskar, Mark [1 ]
Choi, Yejin [1 ]
Zettlemoyer, Luke [1 ,2 ]
Affiliations:
[1] Univ Washington, Paul G Allen Sch Comp Sci & Engn, Seattle, WA 98195 USA
[2] Allen Inst Artificial Intelligence, Seattle, WA USA
Funding:
US National Science Foundation
DOI: 10.18653/v1/P17-1014
CLC number: TP39 [Computer Applications]
Subject classification codes: 081203; 0835
Abstract
Sequence-to-sequence models have shown strong performance across a broad range of applications. However, their application to parsing and generating text with Abstract Meaning Representation (AMR) has been limited, owing to the relatively small amount of labeled data and the non-sequential nature of AMR graphs. We present a novel training procedure that lifts these limitations using millions of unlabeled sentences and careful preprocessing of the AMR graphs. For AMR parsing, our model achieves competitive results of 62.1 SMATCH, the current best score reported without significant use of external semantic resources. For AMR generation, our model establishes a new state of the art of 33.8 BLEU. We present extensive ablative and qualitative analysis, including strong evidence that sequence-based AMR models are robust to ordering variations of graph-to-sequence conversions.
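The abstract's key technical move is turning a non-sequential AMR graph into a token sequence a seq2seq model can read. As a rough illustration, here is a minimal Python sketch of one such graph-to-sequence step: tokenizing a PENMAN-notation AMR and dropping variable names. The function name and the simplifications are assumptions made for illustration, not the authors' exact pipeline, which additionally anonymizes named entities and combines linearization with self-training over millions of unlabeled sentences.

    import re

    def linearize_amr(penman: str) -> list[str]:
        """Flatten a PENMAN AMR string into tokens, dropping variables.

        Example: (w / want-01 :ARG0 (b / boy) :ARG1 (g / go-01 :ARG0 b))
        """
        # Make parentheses and slashes standalone tokens, then split.
        tokens = re.sub(r"([()/])", r" \1 ", penman).split()
        out = []
        skip_next = False
        for i, tok in enumerate(tokens):
            if skip_next:  # this token is the "/" of a dropped pair
                skip_next = False
                continue
            # Drop "variable /" pairs so only concepts and roles remain.
            if i + 1 < len(tokens) and tokens[i + 1] == "/":
                skip_next = True
                continue
            # Re-entrant variables (the bare "b" above) are kept as-is
            # in this simplified sketch.
            out.append(tok)
        return out

    print(" ".join(linearize_amr(
        "(w / want-01 :ARG0 (b / boy) :ARG1 (g / go-01 :ARG0 b))")))
    # -> ( want-01 :ARG0 ( boy ) :ARG1 ( go-01 :ARG0 b ) )

Reading such a sequence back into a graph at parse time requires restoring variables and re-entrancies, which is why the paper's robustness result on ordering variations of these conversions matters.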
Pages: 146-157 (12 pages)
Related papers (50 in total; first 10 shown):
  • [1] Yu, Chen; Gildea, Daniel. Sequence-to-sequence AMR Parsing with Ancestor Information. Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (ACL 2022), Vol. 2 (Short Papers), 2022: 571-577.
  • [2] Xu, Dongqin; Li, Junhui; Zhu, Muhua; Zhang, Min; Zhou, Guodong. Improving AMR Parsing with Sequence-to-Sequence Pre-training. Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), 2020: 2501-2511.
  • [3] He, Han; Choi, Jinho D. Unleashing the True Potential of Sequence-to-Sequence Models for Sequence Tagging and Structure Parsing. Transactions of the Association for Computational Linguistics, 2023, 11: 582-599.
  • [4] Liu, Lemao; Zhu, Muhua; Shi, Shuming. Improving Sequence-to-Sequence Constituency Parsing. Thirty-Second AAAI Conference on Artificial Intelligence / Thirtieth Innovative Applications of Artificial Intelligence Conference / Eighth AAAI Symposium on Educational Advances in Artificial Intelligence, 2018: 4873-4880.
  • [5] Batiz, Orsolya Bernadeu; Helmer, Robert Paul; Pop, Roxana; Macicasan, Florin; Lemnaru, Camelia. Concept Identification with Sequence-to-Sequence Models in Abstract Meaning Representation Parsing. 2020 IEEE 16th International Conference on Intelligent Computer Communication and Processing (ICCP 2020), 2020: 83-90.
  • [6] Shi, Tian; Keneshloo, Yaser; Ramakrishnan, Naren; Reddy, Chandan K. Neural Abstractive Text Summarization with Sequence-to-Sequence Models. ACM/IMS Transactions on Data Science, 2021, 2(1).
  • [7] Ma, Chunpeng; Liu, Lemao; Tamura, Akihiro; Zhao, Tiejun; Sumita, Eiichiro. Deterministic Attention for Sequence-to-Sequence Constituent Parsing. Thirty-First AAAI Conference on Artificial Intelligence, 2017: 3237-3243.
  • [8] Doostmohammadi, Ehsan; Bokaei, Mohammad Hadi; Sameti, Hossein. Persian Keyphrase Generation Using Sequence-to-sequence Models. 2019 27th Iranian Conference on Electrical Engineering (ICEE 2019), 2019: 2010-2015.
  • [9] Zhou, Jiawei; Naseem, Tahira; Astudillo, Ramon Fernandez; Lee, Young-Suk; Florian, Radu; Roukos, Salim. Structure-aware Fine-tuning of Sequence-to-sequence Transformers for Transition-based AMR Parsing. 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP 2021), 2021: 6279-6290.
  • [10] Ma, Hui; Wang, Jian; Lin, Hongfei; Xu, Bo. Graph augmented sequence-to-sequence model for neural question generation. Applied Intelligence, 2023, 53(11): 14628-14644.