An Empirical Study of Encoders and Decoders in Graph-Based Dependency Parsing

Cited by: 1
Authors
Wang, Ge [1 ,2 ,3 ]
Hu, Ziyuan [1 ]
Hu, Zechuan [1 ]
Tu, Kewei [1 ]
Affiliations
[1] ShanghaiTech Univ, Sch Informat Sci & Technol, Shanghai 201210, Peoples R China
[2] Chinese Acad Sci, Shanghai Inst Microsyst & Informat Technol, Shanghai 200050, Peoples R China
[3] Univ Chinese Acad Sci, Beijing 100049, Peoples R China
Source
IEEE ACCESS | 2020, Vol. 8
Funding
National Natural Science Foundation of China;
Keywords
Dependency parsing; high-order model; neural network;
DOI
10.1109/ACCESS.2020.2974109
CLC Number
TP [automation technology; computer technology];
Discipline Code
0812;
Abstract
Graph-based dependency parsing consists of two steps: first, an encoder produces a feature representation for each parsing substructure of the input sentence, which is then used to compute a score for that substructure; second, a decoder finds the parse tree whose substructures have the largest total score. Over the past few years, powerful neural techniques have been introduced into the encoding step, substantially increasing parsing accuracy. However, advanced decoding techniques, in particular high-order decoding, have seen a decline in usage. It is widely believed that the contextualized features produced by neural encoders can capture high-order information and hence diminish the need for a high-order decoder. In this paper, we empirically evaluate combinations of neural and non-neural encoders with first- and second-order decoders and provide a comprehensive analysis of the effectiveness of these combinations across varied training data sizes. We find that: first, with large training data, a strong neural encoder with first-order decoding is sufficient to achieve high parsing accuracy and only slightly lags behind the combination of neural encoding and second-order decoding; second, with small training data, a non-neural encoder with a second-order decoder outperforms the other combinations in most cases.
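To make the two-step pipeline in the abstract concrete, below is a minimal sketch (not code from the paper) of first-order projective decoding with Eisner's algorithm: an arc-score matrix `scores`, standing in for whatever representation and scorer the encoder provides, is decoded into the highest-scoring projective tree. The function name `eisner_decode` and the random scores are illustrative assumptions; the paper's actual encoders and its second-order decoder are not reproduced here.

```python
import numpy as np

def eisner_decode(scores):
    """First-order projective decoding (Eisner's algorithm).

    scores[h, d] is the score of an arc from head h to dependent d;
    index 0 is the artificial ROOT. Returns heads, where heads[d] is
    the predicted head of token d (heads[0] stays -1 for ROOT).
    """
    N = scores.shape[0]
    NEG = -np.inf
    comp = np.full((N, N, 2), NEG)            # complete spans
    incomp = np.full((N, N, 2), NEG)          # incomplete spans (pending arc)
    comp_bp = np.zeros((N, N, 2), dtype=int)  # split-point backpointers
    incomp_bp = np.zeros((N, N, 2), dtype=int)
    for i in range(N):                        # single-word spans cost nothing
        comp[i, i, 0] = comp[i, i, 1] = 0.0

    for k in range(1, N):                     # span width
        for s in range(N - k):
            t = s + k
            # Incomplete span: two complete halves joined by a new arc.
            inner = comp[s, s:t, 1] + comp[s + 1:t + 1, t, 0]
            r = int(np.argmax(inner))
            incomp[s, t, 1] = inner[r] + scores[s, t]   # arc s -> t
            incomp[s, t, 0] = inner[r] + scores[t, s]   # arc t -> s
            incomp_bp[s, t, 0] = incomp_bp[s, t, 1] = s + r
            # Complete span headed at s: incomplete part + complete tail.
            right = incomp[s, s + 1:t + 1, 1] + comp[s + 1:t + 1, t, 1]
            r = int(np.argmax(right))
            comp[s, t, 1] = right[r]
            comp_bp[s, t, 1] = s + 1 + r
            # Complete span headed at t (mirror image).
            left = comp[s, s:t, 0] + incomp[s:t, t, 0]
            r = int(np.argmax(left))
            comp[s, t, 0] = left[r]
            comp_bp[s, t, 0] = s + r

    heads = [-1] * N

    def backtrack(s, t, direction, complete):
        if s == t:
            return
        if complete:
            r = comp_bp[s, t, direction]
            if direction == 1:
                backtrack(s, r, 1, False)
                backtrack(r, t, 1, True)
            else:
                backtrack(s, r, 0, True)
                backtrack(r, t, 0, False)
        else:
            r = incomp_bp[s, t, direction]
            if direction == 1:
                heads[t] = s                  # attach dependent to its head
            else:
                heads[s] = t
            backtrack(s, r, 1, True)
            backtrack(r + 1, t, 0, True)

    backtrack(0, N - 1, 1, True)              # whole sentence headed at ROOT
    return heads

# Illustrative usage: random arc scores standing in for an encoder's output.
rng = np.random.default_rng(0)
scores = rng.normal(size=(6, 6))              # ROOT + 5 tokens
print(eisner_decode(scores))                  # one head index per token
```

A second-order decoder extends the same dynamic program with scores over sibling or grandparent arc pairs, which is the first- versus second-order contrast the abstract refers to; note also that this plain variant does not force ROOT to take exactly one dependent.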
Pages: 35770-35776
Page count: 7