English Translation Technology Based on Transformer Model

Cited: 0
Authors
Lin, Gan [1 ]
Affiliations
[1] Sichuan Univ, Jinjiang Coll, Sch Foreign Languages, Meishan 620860, Sichuan, Peoples R China
Source
PROCEEDINGS OF 2024 INTERNATIONAL CONFERENCE ON MACHINE INTELLIGENCE AND DIGITAL APPLICATIONS, MIDA2024 | 2024
Keywords
English Translation; Transformer Model; Deep Learning; Multi-head Attention; Translation Teaching;
DOI
10.1145/3662739.3672334
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification
081104; 0812; 0835; 1405;
Abstract
With increasingly close international communication, English translation plays an ever more prominent role in foreign language teaching. Existing translation models offer limited comprehensibility and produce results that struggle to capture long-distance word-order dependencies. To improve teaching quality and raise translation accuracy and efficiency, this article studies English translation technology based on the Transformer model. A multi-head attention mechanism is introduced into the Transformer model, with the source sentence as input and the target sentence as output, to strengthen the model's semantic understanding of the input sentence. To verify the model's translation performance, this article evaluates it on two aspects: translation efficiency and quality. In the efficiency tests, the Transformer model's final Speed value decreased by 11.1% and 10.8% compared with the Convolutional Neural Network (CNN) and Recurrent Neural Network (RNN) models, respectively. These results indicate that English translation technology based on the Transformer model helps improve the intelligence of translation results and promotes higher standards of English teaching.
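The abstract describes introducing multi-head attention into the Transformer so the model can relate each position of the source sentence to every other position. The paper's exact configuration is not given here, so the following is a minimal sketch of standard multi-head self-attention in NumPy; all dimensions, weight names, and the random inputs are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(x, Wq, Wk, Wv, Wo, num_heads):
    """Multi-head self-attention over a (seq_len, d_model) input.

    Each head attends over the full sequence, which is what lets the
    model capture long-distance dependencies in word order.
    """
    seq_len, d_model = x.shape
    d_head = d_model // num_heads

    # Project input to queries/keys/values and split into heads:
    # (seq, d_model) -> (heads, seq, d_head)
    def project(W):
        return (x @ W).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)

    q, k, v = project(Wq), project(Wk), project(Wv)

    # Scaled dot-product attention, computed per head.
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)  # (heads, seq, seq)
    out = softmax(scores) @ v                            # (heads, seq, d_head)

    # Concatenate heads and apply the output projection.
    out = out.transpose(1, 0, 2).reshape(seq_len, d_model)
    return out @ Wo

# Toy usage with hypothetical sizes: 5 tokens, model width 8, 2 heads.
rng = np.random.default_rng(0)
d_model, num_heads, seq_len = 8, 2, 5
Wq, Wk, Wv, Wo = (rng.standard_normal((d_model, d_model)) * 0.1
                  for _ in range(4))
x = rng.standard_normal((seq_len, d_model))
y = multi_head_attention(x, Wq, Wk, Wv, Wo, num_heads)
print(y.shape)  # (5, 8): one d_model-dim vector per input token
```

In a full encoder-decoder translation model this layer would be wrapped with residual connections and layer normalization, and the decoder would additionally attend to the encoder's output, but the attention computation itself is the part sketched above.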
Pages: 655-659
Number of pages: 5