Syntax-Aware Complex-Valued Neural Machine Translation

Cited by: 0
Authors
Liu, Yang [1 ]
Hou, Yuexian [1 ]
Affiliations
[1] Tianjin Univ, Coll Intelligence & Comp, Tianjin, Peoples R China
Keywords
Neural machine translation; Attention mechanism; Complex-valued neural network
DOI
10.1007/978-3-031-44192-9_38
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Syntax has proven remarkably effective in neural machine translation (NMT). Previous models obtain syntax information from syntactic parsing tools and integrate it into NMT models to improve translation quality. In this work, we propose a method to incorporate syntax information into a complex-valued encoder-decoder architecture. The proposed model jointly learns word-level and syntax-level attention scores from the source side to the target side using an attention mechanism. Importantly, it does not depend on a specific network architecture and can be integrated directly into any existing sequence-to-sequence (Seq2Seq) framework. Experimental results demonstrate that the proposed method brings significant BLEU improvements on two datasets; in particular, the gains are larger on translation tasks involving language pairs with substantial syntactic differences.
Pages: 474-485
Page count: 12
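The abstract describes attention over complex-valued representations. The paper's exact formulation is not given here, so the following is only a minimal sketch under one common convention for complex attention: score each query-key pair by the real part of the Hermitian inner product, softmax those real-valued scores, and mix the complex-valued value vectors. All names (`complex_attention`, the shapes) are illustrative, not taken from the paper.

```python
import numpy as np

def complex_attention(Q, K, V):
    """Scaled dot-product attention over complex-valued vectors.

    Sketch only (the paper's formulation may differ): the attention
    logit is the real part of the Hermitian inner product q . conj(k),
    scaled by sqrt(d); the softmax weights are therefore real, and the
    output is a complex-valued mixture of the rows of V.
    """
    d = Q.shape[-1]
    # Real part of the Hermitian inner product serves as the logit.
    logits = np.real(Q @ np.conj(K).T) / np.sqrt(d)
    logits -= logits.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(logits)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V  # complex-valued context vectors

rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4)) + 1j * rng.normal(size=(2, 4))  # 2 queries
K = rng.normal(size=(3, 4)) + 1j * rng.normal(size=(3, 4))  # 3 keys
V = rng.normal(size=(3, 4)) + 1j * rng.normal(size=(3, 4))  # 3 values
ctx = complex_attention(Q, K, V)
print(ctx.shape)  # (2, 4)
```

The paper's joint word-level and syntax-level scoring could plausibly be realized by summing two such logit matrices (one from word representations, one from syntax representations) before the softmax, but that detail is an assumption, not stated in the abstract.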