AMR Parsing via Graph⇆Sequence Iterative Inference

Cited by: 0
Authors
Cai, Deng [1 ]
Lam, Wai [1 ]
Affiliations
[1] Chinese Univ Hong Kong, Hong Kong, Peoples R China
Keywords
DOI
Not available
CLC number
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
We propose a new end-to-end model that treats AMR parsing as a series of dual decisions on the input sequence and the incrementally constructed graph. At each time step, our model performs multiple rounds of attention, reasoning, and composition that aim to answer two critical questions: (1) which part of the input sequence to abstract; and (2) where in the output graph to construct the new concept. We show that the answers to these two questions are mutually dependent, each informing the other. We design a model based on iterative inference that refines the answers from both perspectives, leading to greatly improved parsing accuracy. Our model outperforms all previously reported SMATCH scores by large margins. Remarkably, without the help of any large-scale pre-trained language model (e.g., BERT), our model already surpasses the previous state-of-the-art that uses BERT. With the help of BERT, we push the state-of-the-art results to 80.2% on LDC2017T10 (AMR 2.0) and 75.4% on LDC2014T12 (AMR 1.0).
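The abstract describes an iterative inference loop that alternates attention over the input sequence ("which tokens to abstract?") and over the partially built graph ("where to attach the new concept?"), letting each answer refine the other. The following is a minimal, hypothetical PyTorch sketch of one such parsing step; all names (IterativeInferenceStep, seq_attn, graph_attn, compose) and design details are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of a graph<->sequence iterative inference step.
# Each round: attend to the sequence, attend to the partial graph,
# then compose both views into an updated parser state that conditions
# the next round, so the two answers can inform each other.
import torch
import torch.nn as nn


class IterativeInferenceStep(nn.Module):
    def __init__(self, d_model: int, num_rounds: int = 4):
        super().__init__()
        self.num_rounds = num_rounds
        # Attention over the encoded input tokens (sequence side).
        self.seq_attn = nn.MultiheadAttention(d_model, num_heads=8, batch_first=True)
        # Attention over the nodes of the incrementally constructed graph (graph side).
        self.graph_attn = nn.MultiheadAttention(d_model, num_heads=8, batch_first=True)
        # Composition: fuses the two context vectors into a refined parser state.
        self.compose = nn.GRUCell(2 * d_model, d_model)

    def forward(self, state, seq_mem, graph_mem):
        """state: (B, d); seq_mem: (B, T, d) token encodings; graph_mem: (B, N, d) node encodings."""
        seq_w = graph_w = None
        for _ in range(self.num_rounds):
            q = state.unsqueeze(1)  # (B, 1, d) query from the current parser state
            seq_ctx, seq_w = self.seq_attn(q, seq_mem, seq_mem)
            graph_ctx, graph_w = self.graph_attn(q, graph_mem, graph_mem)
            fused = torch.cat([seq_ctx.squeeze(1), graph_ctx.squeeze(1)], dim=-1)
            state = self.compose(fused, state)
        # seq_w: soft answer to "which part of the input to abstract"
        # graph_w: soft answer to "where in the graph to attach the new concept"
        return state, seq_w, graph_w


# Toy usage with random tensors (batch of 2, 10 tokens, 5 graph nodes so far).
step = IterativeInferenceStep(d_model=64)
state = torch.zeros(2, 64)
seq_mem = torch.randn(2, 10, 64)
graph_mem = torch.randn(2, 5, 64)
state, which_tokens, where_in_graph = step(state, seq_mem, graph_mem)
```

In this sketch the number of refinement rounds is a fixed hyperparameter; the final attention weights would feed the concept and edge predictors at each parsing step.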
Pages: 1290 - 1301
Page count: 12
Related Papers
50 in total
  • [1] AMR Parsing as Sequence-to-Graph Transduction
    Zhang, Sheng
    Ma, Xutai
    Duh, Kevin
    Van Durme, Benjamin
    57TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2019), 2019, : 80 - 94
  • [2] Incorporating EDS Graph for AMR Parsing
    Shou, Ziyi
    Lin, Fangzhen
    10TH CONFERENCE ON LEXICAL AND COMPUTATIONAL SEMANTICS (SEM 2021), 2021, : 202 - 211
  • [3] Ensembling Graph Predictions for AMR Parsing
    Lam, Hoang Thanh
    Picco, Gabriele
    Hou, Yufang
    Lee, Young-Suk
    Nguyen, Lam M.
    Phan, Dzung T.
    Lopez, Vanessa
    Astudillo, Ramon Fernandez
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [4] Guiding AMR Parsing with Reverse Graph Linearization
    Gao, Bofei
    Chen, Liang
    Wang, Peiyi
    Sui, Zhifang
    Chang, Baobao
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS - EMNLP 2023, 2023, : 13 - 26
  • [5] Sequence-to-sequence AMR Parsing with Ancestor Information
    Yu, Chen
    Gildea, Daniel
    PROCEEDINGS OF THE 60TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2022): (SHORT PAPERS), VOL 2, 2022, : 571 - 577
  • [6] AMR Parsing as Graph Prediction with Latent Alignment
    Lyu, Chunchuan
    Titov, Ivan
    PROCEEDINGS OF THE 56TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL), VOL 1, 2018, : 397 - 407
  • [7] Neural AMR: Sequence-to-Sequence Models for Parsing and Generation
    Konstas, Ioannis
    Iyer, Srinivasan
    Yatskar, Mark
    Choi, Yejin
    Zettlemoyer, Luke
    PROCEEDINGS OF THE 55TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2017), VOL 1, 2017, : 146 - 157
  • [8] A Differentiable Relaxation of Graph Segmentation and Alignment for AMR Parsing
    Lyu, Chunchuan
    Cohen, Shay B.
    Titov, Ivan
    2021 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2021), 2021, : 9075 - 9091
  • [9] Graph Pre-training for AMR Parsing and Generation
    Bai, Xuefeng
    Chen, Yulong
    Zhang, Yue
    PROCEEDINGS OF THE 60TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2022), VOL 1: (LONG PAPERS), 2022, : 6001 - 6015
  • [10] Improving AMR Parsing with Sequence-to-Sequence Pre-training
    Xu, Dongqin
    Li, Junhui
    Zhu, Muhua
Zhang, Min
    Zhou, Guodong
    PROCEEDINGS OF THE 2020 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP), 2020, : 2501 - 2511