A comparative study of neural machine translation models for Turkish language

Cited: 0
Authors
Ozdemir, Ozgur [1 ]
Akin, Emre Salih [2 ]
Velioglu, Riza [3 ]
Dalyan, Tugba [1 ]
Affiliations
[1] Istanbul Bilgi Univ, Comp Engn Dept, Istanbul, Turkey
[2] Univ Hertfordshire, Dept Comp Sci, Hatfield, Herts, England
[3] Bielefeld Univ, Fac Technol, Bielefeld, Germany
Keywords
Neural machine translation; Gumbel Softmax; sequence to sequence; transformer;
DOI
10.3233/JIFS-211453
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Subject Classification Numbers
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Machine translation (MT) is an important challenge in the field of computational linguistics. In this study, we conducted neural machine translation (NMT) experiments with two different architectures. First, the Sequence to Sequence (Seq2Seq) architecture, along with a variant that utilizes an attention mechanism, is applied to the translation task. Second, an architecture based entirely on the self-attention mechanism, namely the Transformer, is employed for a comprehensive comparison. In addition, the contributions of Byte Pair Encoding (BPE) and the Gumbel Softmax distribution are examined for both architectures. The experiments are conducted on two datasets: TED Talks, one of the popular benchmark datasets for NMT, especially among morphologically rich languages like Turkish, and the WMT18 News dataset, provided by the Third Conference on Machine Translation (WMT) for shared tasks on various aspects of machine translation. The evaluation of Turkish-to-English translation results demonstrates that the Transformer model combining BPE and Gumbel Softmax achieved a BLEU score of 22.4 on TED Talks and 38.7 on the WMT18 News dataset. The empirical results support that using the Gumbel Softmax distribution improves translation quality for both architectures.
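The abstract credits the Gumbel Softmax distribution with improving translation quality for both architectures. The standard Gumbel-Softmax sampling trick it refers to can be sketched as follows (an illustrative NumPy sketch of the general technique, not the authors' implementation; the function name and temperature value are chosen for illustration):

```python
import numpy as np

def gumbel_softmax_sample(logits, tau=1.0, rng=None):
    """Draw a relaxed (differentiable) sample from a categorical
    distribution over the vocabulary via the Gumbel-Softmax trick."""
    rng = np.random.default_rng() if rng is None else rng
    # Gumbel(0, 1) noise: -log(-log(U)), with U ~ Uniform(0, 1)
    u = rng.uniform(1e-10, 1.0, size=np.shape(logits))
    gumbel = -np.log(-np.log(u))
    # Perturb logits with Gumbel noise, then apply a temperature-scaled
    # softmax; lower tau pushes the sample closer to a one-hot vector.
    y = (np.asarray(logits) + gumbel) / tau
    e = np.exp(y - y.max())  # subtract max for numerical stability
    return e / e.sum()

# Example: a relaxed sample over a 3-token "vocabulary"
probs = gumbel_softmax_sample(np.log([0.1, 0.2, 0.7]), tau=0.5)
```

Lower temperatures make the output distribution sharper (closer to discrete sampling), while higher temperatures keep gradients smoother during training.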
Pages: 2103-2113
Number of pages: 11