STA: An efficient data augmentation method for low-resource neural machine translation

Cited by: 2
Authors
Li, Fuxue [1 ,2 ]
Chi, Chuncheng [3 ]
Yan, Hong [2 ]
Liu, Beibei [3 ]
Shao, Mingzhi [3 ]
Affiliations
[1] Northeastern Univ, Sch Comp Sci & Engn, Shenyang, Peoples R China
[2] Yingkou Inst Technol, Coll Elect Engn, Yingkou, Peoples R China
[3] Shenyang Univ Chem Technol, Shenyang, Peoples R China
Keywords
Data augmentation; neural machine translation; sentence trunk; mixture; concatenation
DOI
10.3233/JIFS-230682
CLC number
TP18 [Artificial intelligence theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
The Transformer-based model has achieved state-of-the-art performance in the neural machine translation (NMT) paradigm, but it relies on the availability of copious parallel corpora. For low-resource language pairs, the amount of parallel data is insufficient, resulting in poor translation quality. To alleviate this issue, this paper proposes an efficient data augmentation (DA) method named STA. First, pseudo-parallel sentence pairs are generated by translating sentence trunks with a target-to-source NMT model. Then, two strategies are introduced to merge the original data and the pseudo-parallel corpus into an augmented training set. Experimental results on simulated and real low-resource translation tasks show that the proposed method improves translation quality over a strong baseline and outperforms other data augmentation methods. Moreover, STA yields further gains when combined with back-translation on extra monolingual data.
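The abstract describes the STA pipeline only at a high level: back-translate sentence trunks with a target-to-source model, then merge the resulting pseudo-parallel pairs with the original corpus. The Python sketch below is an illustrative mock-up, not the authors' implementation: extract_trunk, translate_target_to_source, merge_mixture, and merge_concatenation are hypothetical names introduced here, the trunk extractor and reverse NMT model are replaced by toy stand-ins (a real pipeline would use a syntactic parser such as Stanford CoreNLP, reference [22], and a trained target-to-source model), and the two merging functions are plausible readings of the "mixture" and "concatenation" keywords.

import random

# Illustrative sketch of an STA-style augmentation loop (assumed behavior).
# The trunk extractor and the target-to-source NMT model are mocked here.

def extract_trunk(target_sentence: str) -> str:
    """Toy trunk extractor: keeps the first half of the tokens.
    A real extractor would keep the syntactic core of the sentence."""
    tokens = target_sentence.split()
    return " ".join(tokens[: max(1, len(tokens) // 2)])

def translate_target_to_source(target_trunk: str) -> str:
    """Stand-in for the trained target-to-source NMT model."""
    return f"<source-side translation of: {target_trunk}>"

def build_pseudo_pairs(target_sentences):
    """Generate pseudo-parallel pairs (synthetic source, target trunk)."""
    pairs = []
    for tgt in target_sentences:
        trunk = extract_trunk(tgt)
        pairs.append((translate_target_to_source(trunk), trunk))
    return pairs

def merge_mixture(original_pairs, pseudo_pairs, seed=1):
    """Assumed 'mixture' strategy: shuffle original and pseudo pairs together."""
    merged = list(original_pairs) + list(pseudo_pairs)
    random.Random(seed).shuffle(merged)
    return merged

def merge_concatenation(original_pairs, pseudo_pairs):
    """Assumed 'concatenation' strategy: append the pseudo corpus to the original."""
    return list(original_pairs) + list(pseudo_pairs)

if __name__ == "__main__":
    original = [("ein kleines haus am see", "a small house by the lake")]
    pseudo = build_pseudo_pairs([tgt for _, tgt in original])
    print(merge_mixture(original, pseudo))
    print(merge_concatenation(original, pseudo))

Either merged set would then replace the original parallel data alone in a standard Transformer training run.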
Pages: 121-132
Number of pages: 12
Related papers
42 in total
[21] Maimaiti M., Liu Y., Luan H., Sun M. Data augmentation for low-resource languages NMT guided by constrained sampling. International Journal of Intelligent Systems, 2022, 37(1): 30-51.
[22] Manning C.D., Surdeanu M., Bauer J., Finkel J., Bethard S.J., McClosky D. The Stanford CoreNLP Natural Language Processing Toolkit. Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics: System Demonstrations, 2014: 55-60.
[23] Norouzi M. Advances in Neural Information Processing Systems, 2016: 29.
[24] Ott M. NAACL-HLT 2019: Proceedings of the Demonstrations Session, 2019: 48.
[25] Papineni K., Roukos S., Ward T., Zhu W.J. BLEU: a method for automatic evaluation of machine translation. Proceedings of the 40th Annual Meeting of the Association for Computational Linguistics, 2002: 311-318.
[26] Poncelas A. Investigating backtranslation in neural machine translation. 2018.
[27] Post M. Proceedings of the Third Conference on Machine Translation, 2018: 186. DOI: 10.18653/v1/W18-6319.
[28] Ren S. Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (ACL), Vol. 1, 2018: 56.
[29] Sennrich R. arXiv preprint arXiv:1606.02891, 2016.
[30] Sennrich R. arXiv preprint arXiv:1511.06709, 2016.