STA: An efficient data augmentation method for low-resource neural machine translation

Cited by: 2
Authors
Li, Fuxue [1 ,2 ]
Chi, Chuncheng [3 ]
Yan, Hong [2 ]
Liu, Beibei [3 ]
Shao, Mingzhi [3 ]
Affiliations
[1] Northeastern Univ, Sch Comp Sci & Engn, Shenyang, Peoples R China
[2] Yingkou Inst Technol, Coll Elect Engn, Yingkou, Peoples R China
[3] Shenyang Univ Chem Technol, Shenyang, Peoples R China
Keywords
Data augmentation; neural machine translation; sentence trunk; mixture; concatenation;
DOI
10.3233/JIFS-230682
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
Transformer-based neural machine translation (NMT) has achieved state-of-the-art performance in the NMT paradigm. However, it relies on the availability of copious parallel corpora, and for low-resource language pairs the amount of parallel data is insufficient, resulting in poor translation quality. To alleviate this issue, this paper proposes an efficient data augmentation (DA) method named STA. First, pseudo-parallel sentence pairs are generated by translating sentence trunks with a target-to-source NMT model. Then, two strategies are introduced to merge the original data and the pseudo-parallel corpus to augment the training set. Experimental results on simulated and real low-resource translation tasks show that the proposed method improves translation quality over a strong baseline and also outperforms other data augmentation methods. Moreover, STA can further improve translation quality when combined with back-translation using extra monolingual data.
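The abstract (together with the keywords "mixture" and "concatenation") suggests two ways of merging the original corpus with the pseudo-parallel corpus. The sketch below is only an illustration of what such merging strategies might look like, not the paper's actual implementation: it assumes "mixture" means pooling and shuffling the two corpora, and "concatenation" means splicing a pseudo pair onto an original pair to form longer synthetic training sentences.

```python
import random

def mixture(original, pseudo, seed=0):
    """Assumed 'mixture' strategy: pool original and pseudo-parallel
    sentence pairs into one training set and shuffle them together."""
    merged = list(original) + list(pseudo)
    random.Random(seed).shuffle(merged)
    return merged

def concatenation(original, pseudo):
    """Assumed 'concatenation' strategy: splice each pseudo pair onto
    an original pair on both the source and target sides."""
    merged = []
    for (src_o, tgt_o), (src_p, tgt_p) in zip(original, pseudo):
        merged.append((src_o + " " + src_p, tgt_o + " " + tgt_p))
    return merged

# Toy German-English pairs; the pseudo pairs stand in for output of a
# target-to-source NMT model applied to sentence trunks.
original = [("ein Haus", "a house"), ("ein Baum", "a tree")]
pseudo = [("ein Auto", "a car"), ("ein Hund", "a dog")]

print(len(mixture(original, pseudo)))    # augmented set has 4 pairs
print(concatenation(original, pseudo)[0])
```

Either strategy yields an enlarged training set for the low-resource NMT model; back-translation with extra monolingual data can supply additional pseudo pairs to the same pipeline.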
Pages: 121-132 (12 pages)