Bilingual Continuous-Space Language Model Growing for Statistical Machine Translation

Cited by: 21
Authors
Wang, Rui [1 ,2 ]
Zhao, Hai [1 ,2 ]
Lu, Bao-Liang [1 ,2 ]
Utiyama, Masao [3 ]
Sumita, Eiichiro [3 ]
Affiliations
[1] Shanghai Jiao Tong Univ, Dept Comp Sci & Engn, Shanghai 200240, Peoples R China
[2] Shanghai Jiao Tong Univ, Key Lab Shanghai Educ Commiss Intelligent Interac, Shanghai 200240, Peoples R China
[3] Natl Inst Informat & Commun Technol, Multilingual Translat Lab, Kyoto 6190289, Japan
Funding
National Natural Science Foundation of China;
Keywords
Continuous-space language model; language model growing (LMG); neural network language model; statistical machine translation (SMT);
DOI
10.1109/TASLP.2015.2425220
Chinese Library Classification (CLC) number
O42 [Acoustics];
Subject classification codes
070206; 082403;
Abstract
Larger n-gram language models (LMs) perform better in statistical machine translation (SMT). However, existing approaches to constructing larger LMs have two main drawbacks: 1) it is not convenient to obtain larger corpora in the same domain as the bilingual parallel corpora used in SMT; 2) most previous studies focus only on monolingual information from the target corpora, and redundant n-grams have not been fully utilized in SMT. The continuous-space language model (CSLM), especially the neural network language model (NNLM), has recently shown great improvement in the accuracy of estimating the probabilities of predicted target words. However, most of these CSLM and NNLM approaches still consider monolingual information only or require additional corpora. In this paper, we propose a novel neural-network-based bilingual LM growing method. Compared to existing approaches, the proposed method enables us to use the bilingual parallel corpus for LM growing in SMT. The results show that our new method significantly outperforms existing approaches in both SMT performance and computational efficiency.
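To make the NNLM idea in the abstract concrete, the following is a minimal, illustrative sketch of a Bengio-style feed-forward neural network language model that maps a fixed-size word context to a probability distribution over the vocabulary. It is a toy with randomly initialized (untrained) weights, not the paper's bilingual LM growing method; the class name `ToyNNLM` and all dimensions are hypothetical choices for illustration.

```python
import math
import random

class ToyNNLM:
    """Illustrative feed-forward NNLM: embed context words, apply a tanh
    hidden layer, and softmax over the vocabulary. Weights are randomly
    initialized here; a real model would train them on a corpus."""

    def __init__(self, vocab, context_size=2, dim=8, hidden=16, seed=0):
        rng = random.Random(seed)
        self.vocab = {w: i for i, w in enumerate(vocab)}
        V = len(vocab)
        # Random parameters stand in for trained weights (assumption).
        self.emb = [[rng.uniform(-0.1, 0.1) for _ in range(dim)] for _ in range(V)]
        self.W1 = [[rng.uniform(-0.1, 0.1) for _ in range(context_size * dim)]
                   for _ in range(hidden)]
        self.W2 = [[rng.uniform(-0.1, 0.1) for _ in range(hidden)] for _ in range(V)]

    def prob(self, context, word):
        # Concatenate the context word embeddings into one input vector.
        x = [v for w in context for v in self.emb[self.vocab[w]]]
        # Hidden layer with tanh nonlinearity.
        h = [math.tanh(sum(wi * xi for wi, xi in zip(row, x))) for row in self.W1]
        # Output logits and softmax normalization over the vocabulary.
        logits = [sum(wi * hi for wi, hi in zip(row, h)) for row in self.W2]
        z = sum(math.exp(l) for l in logits)
        return math.exp(logits[self.vocab[word]]) / z
```

Because the output layer is a softmax, the probabilities of all vocabulary words given a fixed context sum to one; this continuous-space estimation of P(word | context) is the quantity the abstract says CSLM/NNLM approaches improve over discrete n-gram counts.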
Pages: 1209-1220
Page count: 12