Improving tree-based neural machine translation with dynamic lexicalized dependency encoding

Cited: 14
Authors
Yang, Baosong [1 ]
Wong, Derek F. [1 ]
Chao, Lidia S. [1 ]
Zhang, Min [2 ]
Affiliations
[1] Univ Macau, Nat Language Proc & Portuguese Chinese Machine Tr, Dept Comp & Informat Sci, Macau, Peoples R China
[2] Soochow Univ, Inst Artificial Intelligence, Suzhou, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Syntactic modeling; Dynamic parameters; Tree-RNN; Neural machine translation (NMT);
DOI
10.1016/j.knosys.2019.105042
CLC number
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Tree-to-sequence neural machine translation models have proven effective at learning semantic representations from the exploited syntactic structure. Despite their success, tree-to-sequence models have two major issues: (1) the embeddings of constituents at higher tree levels tend to contribute less to translation; and (2) a single set of model parameters struggles to fully capture the syntactic and semantic richness of linguistic phrases. To address the first problem, we propose a lexicalized dependency model, in which the source-side lexical representations are learned in a head-dependent fashion following a dependency graph. Since the number of dependents is variable, we propose a variant recurrent neural network (RNN) that jointly considers long-distance dependencies and the sequential information of words. Concerning the second problem, we adopt a latent vector to dynamically condition the parameters used to compose each node representation. Experimental results reveal that the proposed model significantly outperforms recently proposed tree-based methods on English-Chinese and English-German translation tasks with far fewer parameters. (C) 2019 Elsevier B.V. All rights reserved.
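The abstract's core idea can be illustrated with a minimal toy sketch (not the paper's actual implementation): each head word's representation is built bottom-up by folding its variable number of dependents through a small recurrent step, with a latent scalar standing in for the dynamically conditioned parameters. All names and the cell itself are illustrative assumptions.

```python
# Toy sketch of head-dependent, bottom-up composition over a dependency
# tree. This is an illustrative assumption, not the paper's model: the
# `latent` scalar loosely mimics dynamically conditioned parameters, and
# the recurrent cell is a simple elementwise blend.
from dataclasses import dataclass, field
from typing import List


@dataclass
class DepNode:
    word: str
    embedding: List[float]
    dependents: List["DepNode"] = field(default_factory=list)


def rnn_step(state: List[float], x: List[float], gain: float) -> List[float]:
    # Toy recurrent cell: elementwise blend of the running state and the
    # next dependent's representation, weighted by `gain`.
    return [gain * s + (1.0 - gain) * v for s, v in zip(state, x)]


def compose(node: DepNode, latent: float = 0.5) -> List[float]:
    # Bottom-up traversal: first compose each dependent's subtree, then
    # fold the dependents (however many there are) into the head word's
    # own embedding, echoing the variant-RNN idea in the abstract.
    state = list(node.embedding)
    for dep in node.dependents:
        state = rnn_step(state, compose(dep, latent), latent)
    return state
```

For example, a head "likes" with dependents "she" and "apples" yields a single fixed-size vector regardless of how many dependents the head has, which is the property the variant RNN is meant to provide.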
Pages: 10
Related papers
50 records
  • [1] A Tree-based Decoder for Neural Machine Translation
    Wang, Xinyi
    Pham, Hieu
    Yin, Pengcheng
    Neubig, Graham
    2018 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2018), 2018, : 4772 - 4777
  • [2] Loosely tree-based alignment for machine translation
    Gildea, D
    41ST ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, PROCEEDINGS OF THE CONFERENCE, 2003, : 80 - 87
  • [3] Regression tree-based dependency modeling for dynamic systems
    Van Welden, DF
    Kerckhoffs, EJH
    SIMULATION AND MODELLING: ENABLERS FOR A BETTER QUALITY OF LIFE, 2000, : 219 - 226
  • [4] A neural reordering model based on phrasal dependency tree for statistical machine translation
    Farzi, Saeed
    Faili, Heshaam
    Kianian, Sahar
    INTELLIGENT DATA ANALYSIS, 2018, 22 (05) : 1163 - 1183
  • [5] Improving Neural Machine Translation Model with Deep Encoding Information
    Duan, Guiduo
    Yang, Haobo
    Qin, Ke
    Huang, Tianxi
    COGNITIVE COMPUTATION, 2021, 13 (04) : 972 - 980
  • [6] Dependency-to-Dependency Neural Machine Translation
    Wu, Shuangzhi
    Zhang, Dongdong
    Zhang, Zhirui
    Yang, Nan
    Li, Mu
    Zhou, Ming
    IEEE-ACM TRANSACTIONS ON AUDIO SPEECH AND LANGUAGE PROCESSING, 2018, 26 (11) : 2132 - 2141
  • [7] Dynamic Programming Encoding for Subword Segmentation in Neural Machine Translation
    He, Xuanli
    Haffari, Gholamreza
    Norouzi, Mohammad
    58TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2020), 2020, : 3042 - 3051
  • [8] Improving Neural Machine Translation Using Rule-Based Machine Translation
    Singh, Muskaan
    Kumar, Ravinder
    Chana, Inderveer
    2019 7TH INTERNATIONAL CONFERENCE ON SMART COMPUTING & COMMUNICATIONS (ICSCC), 2019, : 8 - 12
  • [9] Encoding Gated Translation Memory into Neural Machine Translation
    Cao, Qian
    Xiong, Deyi
    2018 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2018), 2018, : 3042 - 3047