Improved neural machine translation using Natural Language Processing (NLP)

Cited by: 0
Authors
Sk Hasane Ahammad
Ruth Ramya Kalangi
S. Nagendram
Syed Inthiyaz
P. Poorna Priya
Osama S. Faragallah
Alsharef Mohammad
Mahmoud M. A. Eid
Ahmed Nabih Zaki Rashed
Affiliations
[1] Department of ECE, Koneru Lakshmaiah Education Foundation
[2] ECE Department, Dadi Institute of Engineering and Technology
[3] Department of Information Technology, College of Computers and Information Technology, Taif University
[4] Department of Electrical Engineering, College of Engineering, Taif University
[5] Electronics and Electrical Communications Engineering Department, Faculty of Electronic Engineering, Menoufia University
[6] Department of VLSI Microelectronics, Institute of Electronics and Communication Engineering, Saveetha School of Engineering, SIMATS
Source
Multimedia Tools and Applications | 2024, Vol. 83
Keywords
Encoding; Neural Machine Translation; Decoding; NLP; Natural Language Processing; MT;
DOI
Not available
Abstract
Deep learning algorithms have made significant progress, and many model designs and methodologies have been tested to improve performance across various fields of Natural Language Processing (NLP). Translation is one such field, addressed through the state-of-the-art process of machine translation. Deep learning refers to the use of neural networks with multiple layers to model complex patterns in data. In the context of neural machine translation (NMT), deep learning models can capture the complex relationships between source and target languages, leading to more accurate and fluent translations. The encoder-decoder system is a framework for NMT that uses two neural networks, an encoder and a decoder, to map input sequences to output sequences. The encoder network processes the input sequence and produces a fixed-length representation of it, while the decoder network generates the output sequence from that representation. Machine translation is the process by which a computer interprets speech or text and reproduces it in another language, approximating what would otherwise require human intervention. Beyond being a prominent research area, numerous approaches have been established, including rule-based, statistical, and supervised machine translation, and neural networks in particular have achieved considerable advances in this task. In this research, we review various encoder-decoder strategies for neural machine translation. Most NMT models are built on a sequential encoder-decoder framework that does not employ syntactic information.
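The encoder-decoder framework summarized in the abstract can be sketched in a few lines. The following is a minimal illustrative example, not code from the paper: a GRU-based encoder compresses a source sentence into a fixed-length hidden state, and a decoder generates target-language logits conditioned on that state. All class names, vocabulary sizes, and dimensions are assumptions made for illustration.

# Minimal encoder-decoder NMT sketch in PyTorch (illustrative only;
# vocabulary sizes, dimensions, and token ids below are made up).
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self, src_vocab, emb_dim=64, hid_dim=128):
        super().__init__()
        self.embed = nn.Embedding(src_vocab, emb_dim)
        self.rnn = nn.GRU(emb_dim, hid_dim, batch_first=True)

    def forward(self, src):
        # src: (batch, src_len) token ids -> fixed-length hidden state
        _, hidden = self.rnn(self.embed(src))
        return hidden  # (1, batch, hid_dim)

class Decoder(nn.Module):
    def __init__(self, tgt_vocab, emb_dim=64, hid_dim=128):
        super().__init__()
        self.embed = nn.Embedding(tgt_vocab, emb_dim)
        self.rnn = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, tgt_vocab)

    def forward(self, tgt_in, hidden):
        # tgt_in: (batch, tgt_len) shifted target tokens (teacher forcing)
        output, hidden = self.rnn(self.embed(tgt_in), hidden)
        return self.out(output), hidden  # logits over the target vocabulary

# Toy usage: a batch of 2 source sentences of length 5, decoded into
# target sequences of length 6.
src = torch.randint(0, 1000, (2, 5))
tgt_in = torch.randint(0, 1200, (2, 6))
encoder, decoder = Encoder(1000), Decoder(1200)
context = encoder(src)                # encoder's fixed-length representation
logits, _ = decoder(tgt_in, context)  # decoder conditioned on that context
print(logits.shape)                   # torch.Size([2, 6, 1200])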
Pages: 39335-39348
Page count: 13