Efficient English Translation Method and Analysis Based on the Hybrid Neural Network

Cited by: 1
Author
Wang, Chuncheng [1 ]
Affiliation
[1] Tongling Univ, Tongling 244061, Peoples R China
Keywords
Long short-term memory
DOI
10.1155/2021/9985251
Chinese Library Classification (CLC): TP [automation technology, computer technology]
Discipline code: 0812
Abstract
Neural machine translation has attracted wide attention in recent years. The traditional sequential neural network framework for English translation has a clear disadvantage: its limited ability to capture long-distance dependencies. Current improved frameworks, such as the recurrent neural network, still do not solve this problem well. In this paper, we propose a hybrid neural network that combines a convolutional neural network (CNN) with long short-term memory (LSTM), and we introduce an attention mechanism on top of the encoder-decoder structure to improve translation accuracy, especially for long sentences. In our experiments, the model is implemented in TensorFlow, and the results show that the BLEU score of the proposed method improves markedly over traditional machine learning models, demonstrating the effectiveness of our method for English-Chinese translation.
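As a rough illustration only (not the paper's exact architecture, layer sizes, or hyperparameters, none of which are given in the abstract), a hybrid CNN + LSTM encoder-decoder with attention can be sketched in Keras: an embedding layer, a 1-D convolution that extracts local n-gram features, an LSTM encoder, and an LSTM decoder that attends over the encoder outputs. The vocabulary size, sequence length, and embedding width below are hypothetical placeholders.

```python
# Hypothetical sketch of a CNN + LSTM encoder-decoder with attention,
# in the spirit of the hybrid model described in the abstract.
import tensorflow as tf
from tensorflow.keras import layers

VOCAB = 1000    # hypothetical vocabulary size
EMBED = 64      # hypothetical embedding / hidden width
SEQ_LEN = 20    # hypothetical (padded) sentence length

# Encoder: embedding -> 1-D convolution (local features) -> LSTM
src = layers.Input(shape=(SEQ_LEN,), dtype="int32", name="source_tokens")
x = layers.Embedding(VOCAB, EMBED)(src)
x = layers.Conv1D(EMBED, kernel_size=3, padding="same", activation="relu")(x)
enc_seq, state_h, state_c = layers.LSTM(
    EMBED, return_sequences=True, return_state=True)(x)

# Decoder: LSTM seeded with the encoder state, plus dot-product attention
# over the encoder output sequence (Keras's built-in Attention layer).
tgt = layers.Input(shape=(SEQ_LEN,), dtype="int32", name="target_tokens")
y = layers.Embedding(VOCAB, EMBED)(tgt)
dec_seq = layers.LSTM(EMBED, return_sequences=True)(
    y, initial_state=[state_h, state_c])
context = layers.Attention()([dec_seq, enc_seq])  # query=decoder, value=encoder
merged = layers.Concatenate()([dec_seq, context])
out = layers.Dense(VOCAB, activation="softmax")(merged)

model = tf.keras.Model([src, tgt], out)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```

The attention layer lets each decoder step weight all encoder positions, which is exactly the mechanism the abstract credits for better handling of long sentences; the convolution in front of the encoder LSTM supplies the CNN half of the hybrid.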
Pages: 10