A Survey on Low-resource Neural Machine Translation

Cited by: 0
Authors
Li H.-Z. [1,2,3,4]
Feng C. [1,2]
Huang H.-Y. [1,2]
Affiliations
[1] School of Computer Science and Technology, Beijing Institute of Technology, Beijing
[2] Beijing Engineering Research Center of High-Volume Language Information Processing and Cloud Computing Applications, Beijing
[3] School of Foreign Languages, Beijing Institute of Technology, Beijing
[4] Key Laboratory of Language, Cognition and Computation, Ministry of Industry and Information Technology, Beijing
Source
Acta Automatica Sinica, Vol. 47, Science Press (2021). Corresponding author: Huang, He-Yan (hhy63@bit.edu.cn)
Funding
National Key Research and Development Program of China; National Natural Science Foundation of China; China Postdoctoral Science Foundation;
Keywords
Back translation; Low-resource language; Multilingual translation; Neural machine translation; Pivot language; Transfer learning; Unsupervised translation;
DOI
10.16383/j.aas.c200103
Abstract
As the mainstream approach in machine translation, neural machine translation (NMT) has achieved great improvements on many resource-rich languages, but its performance on low-resource languages remains unsatisfactory. Low-resource NMT has become one of the most active topics in machine translation and has attracted wide attention in recent years. This paper presents a survey of low-resource NMT research. We first introduce related academic activities and available datasets, then categorize and summarize in detail the main types of approaches used in low-resource NMT, presenting their features and the relations between them, and describing the current state of research. Finally, we suggest possible research trends and directions for this field. Copyright © 2021 Acta Automatica Sinica. All rights reserved.
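A minimal sketch of back translation, one of the keyword techniques the survey covers: a reverse target-to-source model turns monolingual target-language text into synthetic parallel pairs that augment the scarce genuine bitext used to train the forward model. The Python below uses a hypothetical stub, translate_tgt2src, standing in for a trained reverse model; it is an assumption for illustration, not code from the paper.

    def translate_tgt2src(sentence):
        # Placeholder (assumption) for a trained target->source NMT model.
        return "<synthetic source of: " + sentence + ">"

    def back_translate(mono_tgt):
        # Monolingual target sentences -> synthetic (source, target) pairs.
        return [(translate_tgt2src(t), t) for t in mono_tgt]

    mono = ["Ein Beispiel.", "Noch ein Satz."]        # monolingual target data
    real_pairs = [("an example", "Ein Beispiel.")]    # scarce genuine bitext
    training_data = real_pairs + back_translate(mono) # mixed training corpus
    for src, tgt in training_data:
        print(src, "=>", tgt)

Mixing synthetic and genuine pairs in this way is the standard back-translation recipe for data augmentation in low-resource NMT.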
Pages: 1217-1231
Number of pages: 14