Syntax-aware neural machine translation directed by syntactic dependency degree

Cited by: 5
Authors
Peng, Ru [1 ]
Hao, Tianyong [2 ]
Fang, Yi [1 ]
Affiliations
[1] Guangdong Univ Technol, Sch Informat Engn, Guangzhou, Peoples R China
[2] South China Normal Univ, Sch Comp Sci, Guangzhou, Peoples R China
Source
NEURAL COMPUTING & APPLICATIONS | 2021, Vol. 33, Issue 23
Funding
National Natural Science Foundation of China;
Keywords
Syntactic dependency degree; Syntax-aware distance; Syntax-aware attentions; Neural machine translation;
DOI
10.1007/s00521-021-06256-4
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Code
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
There are various ways to incorporate syntactic knowledge into neural machine translation (NMT). However, quantifying the dependency syntactic intimacy (DSI) between word pairs in a dependency tree has not been exploited in attentional or Transformer-based NMT. In this paper, we propose a novel variant of Tree-LSTM to capture the syntactic dependency degree (SDD) between word pairs in dependency trees, and derive two syntax-aware distances from it: a tuned syntax distance and a rho-dependent distance. For attentional NMT, we propose two syntax-aware attentions based on these two distances, and we further design a dual attention that simultaneously generates the global context and the dependency-syntactic context. For Transformer-based NMT, we explicitly incorporate dependency syntax into the self-attention network (SAN), yielding a syntax-aware SAN. Experiments on the IWSLT'17 English-German, IWSLT Chinese-English, and WMT'15 English-Finnish translation tasks show that our syntax-aware NMT significantly improves translation quality over baseline methods, including the state-of-the-art Transformer-based NMT.
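Since the abstract only sketches the method, the following is a minimal, hypothetical illustration of the general idea of biasing self-attention with a dependency-tree distance. It is not the authors' implementation: plain hop distance in the tree stands in for the learned SDD scores produced by the Tree-LSTM variant, and `alpha` is an assumed scaling hyperparameter rather than either of the paper's two proposed distances.

```python
# Sketch only: hop distance replaces the paper's learned SDD; `alpha` is assumed.
import torch
import torch.nn.functional as F

def tree_distances(parents):
    """Pairwise hop distances between all word pairs in a dependency tree,
    given as a list of parent indices (the root points to itself)."""
    n = len(parents)
    paths = []
    for i in range(n):
        path = [i]
        while parents[path[-1]] != path[-1]:
            path.append(parents[path[-1]])
        paths.append(path)
    depth = [{node: d for d, node in enumerate(p)} for p in paths]
    dist = torch.zeros(n, n)
    for i in range(n):
        for j in range(n):
            # The first node on i's root path that also lies on j's root
            # path is their lowest common ancestor.
            for d_i, node in enumerate(paths[i]):
                if node in depth[j]:
                    dist[i, j] = d_i + depth[j][node]
                    break
    return dist

def syntax_aware_attention(q, k, v, parents, alpha=0.5):
    """Scaled dot-product attention whose logits are penalized by
    alpha * tree distance, so syntactically close word pairs attend
    to each other more strongly."""
    d_k = q.size(-1)
    logits = (q @ k.transpose(-2, -1)) / d_k ** 0.5
    logits = logits - alpha * tree_distances(parents)  # syntax-aware bias
    return F.softmax(logits, dim=-1) @ v

# Toy usage: 4 words; word 1 is the root, word 2 attaches to word 3.
parents = [1, 1, 3, 1]
x = torch.randn(4, 8)
out = syntax_aware_attention(x, x, x, parents)
print(out.shape)  # torch.Size([4, 8])
```

An additive distance penalty of this kind keeps the attention differentiable end to end; a learned SDD score, as described in the abstract, would simply replace the fixed hop-distance matrix.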
Pages: 16609-16625
Number of pages: 17