Syntax-aware neural machine translation directed by syntactic dependency degree

Cited by: 5
Authors
Peng, Ru [1 ]
Hao, Tianyong [2 ]
Fang, Yi [1 ]
Affiliations
[1] Guangdong Univ Technol, Sch Informat Engn, Guangzhou, Peoples R China
[2] South China Normal Univ, Sch Comp Sci, Guangzhou, Peoples R China
Source
NEURAL COMPUTING & APPLICATIONS | 2021, Vol. 33, Issue 23
Funding
National Natural Science Foundation of China;
Keywords
Syntactic dependency degree; Syntax-aware distance; Syntax-aware attentions; Neural machine translation;
DOI
10.1007/s00521-021-06256-4
CLC Number
TP18 [Theory of Artificial Intelligence];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
There are various ways to incorporate syntactic knowledge into neural machine translation (NMT). However, quantifying the dependency syntactic intimacy (DSI) between word pairs in a dependency tree has not been considered for use in attentional and transformer-based NMT. In this paper, we propose a variant of Tree-LSTM to capture the syntactic dependency degree (SDD) between word pairs in dependency trees, and derive two syntax-aware distances from it: a tuned syntax distance and a rho-dependent distance. For attentional NMT, we propose two syntax-aware attentions based on these distances, and we also design a dual attention that simultaneously generates the global context and the dependency syntactic context. For transformer-based NMT, we explicitly incorporate dependency syntax into the self-attention network (SAN), yielding a syntax-aware SAN. Experiments on the IWSLT'17 English-German, IWSLT Chinese-English, and WMT'15 English-Finnish translation tasks show that our syntax-aware NMT significantly improves translation quality over baseline methods, including the state-of-the-art transformer-based NMT.
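The abstract describes biasing attention with dependency-tree distances. Below is a minimal sketch of that general idea, not the authors' exact formulation: the function name syntax_aware_attention, the distance matrix dep_dist, and the weighting factor rho are illustrative assumptions showing how a pairwise syntactic distance could penalise attention scores between syntactically distant words.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def syntax_aware_attention(Q, K, V, dep_dist, rho=1.0):
    """Illustrative sketch (not the paper's exact method): scaled dot-product
    attention whose logits are penalised by a dependency-tree distance.

    Q, K, V  : (seq_len, d) query/key/value matrices
    dep_dist : (seq_len, seq_len) pairwise distances in the dependency tree
    rho      : assumed hyper-parameter weighting the syntactic penalty
    """
    d = Q.shape[-1]
    scores = (Q @ K.T) / np.sqrt(d)      # standard attention logits
    scores = scores - rho * dep_dist     # syntactically distant pairs score lower
    weights = softmax(scores, axis=-1)
    return weights @ V, weights

# Toy usage: 4 tokens with 8-dim states and a hand-made distance matrix.
rng = np.random.default_rng(0)
Q, K, V = (rng.normal(size=(4, 8)) for _ in range(3))
dep_dist = np.array([[0., 1., 2., 3.],
                     [1., 0., 1., 2.],
                     [2., 1., 0., 1.],
                     [3., 2., 1., 0.]])
context, attn = syntax_aware_attention(Q, K, V, dep_dist, rho=0.5)
```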
Pages: 16609-16625 (17 pages)
Related Papers
50 records in total
  • [41] Enhancing Machine Translation with Dependency-Aware Self-Attention
    Bugliarello, Emanuele
    Okazaki, Naoaki
    58TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2020), 2020, : 1618 - 1627
  • [42] Challenges in Context-Aware Neural Machine Translation
    Jin, Linghao
    He, Jacqueline
    May, Jonathan
    Ma, Xuezhe
    2023 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2023), 2023, : 15246 - 15263
  • [43] Linguistic Knowledge-Aware Neural Machine Translation
    Li, Qiang
    Wong, Derek F.
    Chao, Lidia S.
    Zhu, Muhua
    Xiao, Tong
    Zhu, Jingbo
    Zhang, Min
    IEEE-ACM TRANSACTIONS ON AUDIO SPEECH AND LANGUAGE PROCESSING, 2018, 26 (12) : 2341 - 2354
  • [44] Improving semi-autoregressive machine translation with the guidance of syntactic dependency parsing structure
    Chen, Xinran
    Duan, Sufeng
    Liu, Gongshen
    NEUROCOMPUTING, 2025, 614
  • [45] Quality-Aware Decoding for Neural Machine Translation
    Fernandes, Patrick
    Farinhas, Antonio
    Rei, Ricardo
    de Souza, Jose G. C.
    Ogayo, Perez
    Neubig, Graham
    Martins, Andre F. T.
    NAACL 2022: THE 2022 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES, 2022, : 1396 - 1412
  • [46] Optimization of Unsupervised Neural Machine Translation Based on Syntactic Knowledge Improvement
    Zhou, Aiping
    INTERNATIONAL JOURNAL OF ADVANCED COMPUTER SCIENCE AND APPLICATIONS, 2023, 14 (11) : 90 - 99
  • [47] Semantic and syntactic information for neural machine translation: Injecting Features to the Transformer
    Armengol-Estape, Jordi
    Costa-jussa, Marta R.
    MACHINE TRANSLATION, 2021, 35 (01) : 3 - 17
  • [48] Neural Machine Translation with Attention Based on a New Syntactic Branch Distance
    Peng, Ru
    Chen, Zhitao
    Hao, Tianyong
    Fang, Yi
    MACHINE TRANSLATION, CCMT 2019, 2019, 1104 : 47 - 57
  • [49] Handling syntactic difference in Chinese-Vietnamese neural machine translation
    Yu, Zhiqiang
    Wang, Ting
    Liu, Shihu
    Tan, Xuewen
    JOURNAL OF INTELLIGENT & FUZZY SYSTEMS, 2024, 46 (03) : 5533 - 5544
  • [50] Self-supervised Bilingual Syntactic Alignment for Neural Machine Translation
    Zhang, Tianfu
    Huang, Heyan
    Feng, Chong
    Cao, Longbing
    THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2021, 35 : 14454 - 14462