Syntax-aware neural machine translation directed by syntactic dependency degree

Cited by: 5
Authors
Peng, Ru [1 ]
Hao, Tianyong [2 ]
Fang, Yi [1 ]
Affiliations
[1] Guangdong University of Technology, School of Information Engineering, Guangzhou, People's Republic of China
[2] South China Normal University, School of Computer Science, Guangzhou, People's Republic of China
Source
NEURAL COMPUTING & APPLICATIONS, 2021, Vol. 33, Issue 23
Funding
National Natural Science Foundation of China
Keywords
Syntactic dependency degree; Syntax-aware distance; Syntax-aware attentions; Neural machine translation;
DOI
10.1007/s00521-021-06256-4
CLC Number
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
There are various ways to incorporate syntax knowledge into neural machine translation (NMT). However, quantifying the dependency syntactic intimacy (DSI) between word pairs in a dependency tree has not been considered for use in attentional and transformer-based NMT. In this paper, we propose a variant of Tree-LSTM to capture the syntactic dependency degree (SDD) between word pairs in dependency trees. Two syntax-aware distances are proposed: a tuned syntax distance and a rho-dependent distance. For attentional NMT, we propose two syntax-aware attentions based on these distances, and we also design a dual attention to simultaneously generate global context and dependency syntactic context. For transformer-based NMT, we explicitly incorporate dependency syntax into the self-attention network (SAN) to obtain a syntax-aware SAN. Experiments on the IWSLT'17 English-German, IWSLT Chinese-English and WMT'15 English-Finnish translation tasks show that our syntax-aware NMT significantly improves translation quality over baseline methods, including the state-of-the-art transformer-based NMT.
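The record reproduces no formulas, but the mechanism the abstract names for transformer-based NMT (injecting dependency syntax into self-attention) can be illustrated. Below is a minimal NumPy sketch, under the assumption that the syntactic dependency degree between two words is approximated by their path length in the dependency tree and applied as an additive penalty on the attention logits; the function names, the heads encoding, and the rho weighting are illustrative assumptions, not the authors' implementation.

import numpy as np

def tree_distances(heads):
    # heads[i] is the index of token i's dependency head; the root satisfies
    # heads[i] == i. Returns an n x n matrix of path lengths in the tree.
    n = len(heads)

    def path_to_root(i):
        path = [i]
        while heads[path[-1]] != path[-1]:
            path.append(heads[path[-1]])
        return path

    dist = np.zeros((n, n))
    for i in range(n):
        pos_i = {node: k for k, node in enumerate(path_to_root(i))}
        for j in range(n):
            for k, node in enumerate(path_to_root(j)):
                if node in pos_i:  # first shared node = lowest common ancestor
                    dist[i, j] = pos_i[node] + k
                    break
    return dist

def syntax_aware_attention(Q, K, V, heads, rho=0.5):
    # Scaled dot-product attention with an additive syntactic bias: word pairs
    # that are close in the dependency tree are penalized less. rho is a
    # hypothetical scalar controlling how strongly syntax reshapes attention.
    logits = Q @ K.T / np.sqrt(Q.shape[-1])
    logits = logits - rho * tree_distances(heads)
    weights = np.exp(logits - logits.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ V

# Example: "she likes cats", with token 1 ("likes") as the root.
rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((3, 8)) for _ in range(3))
out = syntax_aware_attention(Q, K, V, heads=[1, 1, 1])

A learned per-head rho, or mapping the distance through an embedding, would be natural variants; note that the paper's actual distances (the tuned syntax distance and the rho-dependent distance) are derived from a Tree-LSTM variant rather than from raw path length as in this sketch.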
Pages: 16609-16625
Page count: 17
Related Papers
(50 in total)
  • [21] Metapath and syntax-aware heterogeneous subgraph neural networks for spam review detection
    Zhang, Zhiqiang; Dong, Yuhang; Wu, Haiyan; Song, Haiyu; Deng, Shengchun; Chen, Yanhong
    APPLIED SOFT COMPUTING, 2022, 128
  • [22] Improving Neural Machine Translation with Neural Syntactic Distance
    Ma, Chunpeng; Tamura, Akihiro; Utiyama, Masao; Zhao, Tiejun; Sumita, Eiichiro
    2019 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES (NAACL HLT 2019), VOL. 1, 2019: 2032-2037
  • [23] Metataxis: Contrastive Dependency Syntax for Machine Translation, by Schubert, K. (review)
    Koutny, I
    LINGUISTICS, 1990, 28 (03): 612-617
  • [26] Improved Neural Machine Translation with Source Syntax
    Wu, Shuangzhi; Zhou, Ming; Zhang, Dongdong
    PROCEEDINGS OF THE TWENTY-SIXTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2017: 4179-4185
  • [27] Modeling Source Syntax for Neural Machine Translation
    Li, Junhui; Xiong, Deyi; Tu, Zhaopeng; Zhu, Muhua; Zhang, Min; Zhou, Guodong
    PROCEEDINGS OF THE 55TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2017), VOL 1, 2017: 688-697
  • [28] Improve Neural Machine Translation by Syntax Tree
    Chen, Siyu; Yu, Qingsong
    ISCSIC'18: PROCEEDINGS OF THE 2ND INTERNATIONAL SYMPOSIUM ON COMPUTER SCIENCE AND INTELLIGENT CONTROL, 2018
  • [29] Sequence-to-Dependency Neural Machine Translation
    Wu, Shuangzhi; Zhang, Dongdong; Yang, Nan; Li, Mu; Zhou, Ming
    PROCEEDINGS OF THE 55TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2017), VOL 1, 2017: 698-707
  • [30] Synchronous Syntactic Attention for Transformer Neural Machine Translation
    Deguchi, Hiroyuki; Tamura, Akihiro; Ninomiya, Takashi
    ACL-IJCNLP 2021: THE 59TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS AND THE 11TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING: PROCEEDINGS OF THE STUDENT RESEARCH WORKSHOP, 2021: 348-355