Research on Traditional Mongolian-Chinese Neural Machine Translation Based on Dependency Syntactic Information and Transformer Model

Cited by: 3
Authors
Qing-dao-er-ji, Ren [1 ]
Cheng, Kun [1 ,2 ]
Pang, Rui [1 ,2 ]
Affiliations
[1] Inner Mongolia Univ Technol, Sch Informat Engn, Hohhot 010051, Peoples R China
[2] Inner Mongolia Power Grp CO Ltd, Informat & Commun Div, Wuhai Power Supply Co, Wuhai 016000, Peoples R China
Source
APPLIED SCIENCES-BASEL | 2022, Vol. 12, Iss. 19
Funding
National Natural Science Foundation of China;
Keywords
attention mechanism; dependency parsing; convolutional neural networks; Transformer;
DOI
10.3390/app121910074
Chinese Library Classification (CLC) number
O6 [Chemistry];
Discipline code
0703;
Abstract
Neural machine translation (NMT) is a data-driven approach that has proven its superiority on large corpora, but it still leaves much room for improvement when corpus resources are scarce. This work aims to improve the translation quality of Traditional Mongolian-Chinese (MN-CH) translation. First, a baseline model is constructed from the Transformer, and two different syntax-assisted learning units are then added to the encoder and decoder. As a result, the encoder's ability to learn Traditional Mongolian syntax is implicitly strengthened, while knowledge of Chinese dependency syntax is taken as prior knowledge to explicitly guide the decoder in learning Chinese syntax. The average BLEU scores measured under two experimental conditions showed that the proposed model outperformed the baseline by 6.706 (45.141-38.435) and 5.409 (41.930-36.521). Analysis of the results also revealed that the improved model was still deficient in learning Chinese syntax, so the Primer-EZ method was introduced to ameliorate this problem, leading to faster convergence and better translation quality. Under the experimental conditions N = 5 and epochs = 35, the final improved model achieved an average BLEU increase of 9.113 (45.634-36.521) over the baseline. The experiments showed that both the proposed model architecture and the prior knowledge could effectively increase the BLEU score, and that the added syntax-assisted learning units not only corrected initial word associations but also alleviated long-range dependency problems between words.
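For background on the Primer-EZ method mentioned in the abstract: Primer-EZ modifies the standard Transformer with two small changes, squaring the ReLU activation in the feed-forward sublayer and adding a causal depthwise convolution to the attention head projections. The following is a minimal numpy sketch of those two operations in isolation; it is an illustration of the general technique, not the authors' implementation.

```python
import numpy as np

def squared_relu(x):
    """Primer-EZ feed-forward activation: max(x, 0) ** 2
    (replaces the standard Transformer ReLU)."""
    r = np.maximum(x, 0.0)
    return r * r

def causal_depthwise_conv1d(seq, kernel):
    """Causal per-channel convolution over the time axis.
    seq: (T, C) sequence; kernel: (K, C), one K-tap filter per
    channel. Primer-EZ applies a convolution of this form to the
    query/key/value head projections."""
    T, _ = seq.shape
    K = kernel.shape[0]
    out = np.zeros_like(seq)
    for t in range(T):
        for k in range(K):
            if t - k >= 0:
                out[t] += kernel[k] * seq[t - k]
    return out

x = np.array([-2.0, -0.5, 0.0, 1.0, 3.0])
print(squared_relu(x))  # [0. 0. 0. 1. 9.]
```

Both changes are cheap at inference time, which is consistent with the abstract's observation that Primer-EZ brought faster convergence without hurting translation quality.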
Pages: 18
Related Papers
50 records in total
  • [1] Introduction of Traditional Mongolian-Chinese Machine Translation
    Wu, J.
    Hou, H. X.
    Monghjaya
    Bao, F. L.
    Xie, C. J.
    PROCEEDINGS OF THE 2015 INTERNATIONAL CONFERENCE ON ELECTRICAL, AUTOMATION AND MECHANICAL ENGINEERING (EAME 2015), 2015, 13 : 357 - 360
  • [2] The Research on Morpheme-Based Mongolian-Chinese Neural Machine Translation
    Wang, Siriguleng
    Wuyuntana
    2019 IEEE 2ND INTERNATIONAL CONFERENCE ON INFORMATION AND COMPUTER TECHNOLOGIES (ICICT), 2019, : 138 - 142
  • [3] Research on Mongolian-Chinese machine translation based on the end-to-end neural network
    Qing-Dao-Er-Ji, Ren
    Su, Yila
    Wu, Nier
    INTERNATIONAL JOURNAL OF WAVELETS MULTIRESOLUTION AND INFORMATION PROCESSING, 2020, 18 (01)
  • [4] Mongolian-Chinese Neural Machine Translation Based on Sustained Transfer Learning
    Wang, Bailun
    Ji, Yatu
    Wu, Nier
    Liu, Xu
    Wang, Yanli
    Mao, Rui
    Yuan, Shuai
    Ren, Qing-Dao-Er-Ji
    Liu, Na
    Zhuang, Xufei
    Lu, Min
    ADVANCED INTELLIGENT COMPUTING TECHNOLOGY AND APPLICATIONS, PT III, ICIC 2024, 2024, 14877 : 304 - 315
  • [5] Template-Based Model for Mongolian-Chinese Machine Translation
    Wu, Jing
    Hou, Hongxu
    Bao, Feilong
    Jiang, Yupeng
    JOURNAL OF ADVANCED COMPUTATIONAL INTELLIGENCE AND INTELLIGENT INFORMATICS, 2016, 20 (06) : 893 - 901
  • [6] Key Research of Pre-processing on Mongolian-Chinese Neural Machine Translation
    Du, Jian
    Hou, Hongxu
    Wu, Jing
    Shen, Zhipeng
    Li, Jinting
    Wang, Hongbin
    PROCEEDINGS OF THE 2016 2ND INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND INDUSTRIAL ENGINEERING (AIIE 2016), 2016, 133 : 1 - 6
  • [7] Research on Unknown Words Processing of Mongolian-Chinese Neural Machine Translation Based on Semantic Similarity
    Hasigaowa
    Wang, Siriguleng
    2019 IEEE 4TH INTERNATIONAL CONFERENCE ON COMPUTER AND COMMUNICATION SYSTEMS (ICCCS 2019), 2019, : 370 - 374
  • [8] Research on Mongolian-Chinese Translation Model Based on Transformer with Soft Context Data Augmentation Technique
    Ren, Qing-dao-er-ji
    Li, Yuan
    Bao, Shi
    Liu, Yong-chao
    Chen, Xiu-hong
    IEICE TRANSACTIONS ON FUNDAMENTALS OF ELECTRONICS COMMUNICATIONS AND COMPUTER SCIENCES, 2022, E105A (05) : 871 - 876
  • [9] Improving Mongolian-Chinese Neural Machine Translation with Morphological Noise
    Ji, Yatu
    Hou, Hongxu
    Wu, Nier
    Chen, Junjie
57TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2019): STUDENT RESEARCH WORKSHOP, 2019, : 123 - 135
  • [10] A Mongolian-Chinese Neural Machine Translation Model Based on Soft Target Templates and Contextual Knowledge
    Ren, Qing-Dao-Er-Ji
    Pang, Ziyu
    Lang, Jiajun
    APPLIED SCIENCES-BASEL, 2023, 13 (21):