Improved neural machine translation using Natural Language Processing (NLP)

Cited: 5
Authors
Ahammad, Sk Hasane [1 ]
Kalangi, Ruth Ramya [1 ]
Nagendram, S. [1 ]
Inthiyaz, Syed [1 ]
Priya, P. Poorna [2 ]
Faragallah, Osama S. [3 ]
Mohammad, Alsharef [4 ]
Eid, Mahmoud M. A. [4 ]
Rashed, Ahmed Nabih Zaki [5 ,6 ]
Affiliations
[1] Koneru Lakshmaiah Educ Fdn, Dept ECE, Vaddeswaram 522302, India
[2] Dadi Inst Engn & Technol, ECE Dept, Anakapalle, Visakhapatnam, India
[3] Taif Univ, Coll Comp & Informat Technol, Dept Informat Technol, POB 11099, Taif 21944, Saudi Arabia
[4] Taif Univ, Coll Engn, Dept Elect Engn, POB 11099, Taif 21944, Saudi Arabia
[5] Menoufia Univ, Fac Elect Engn, Elect & Elect Commun Engn Dept, Menoufia 32951, Egypt
[6] SIMATS, Inst Elect & Commun Engn, Saveetha Sch Engn, Dept VLSI Microelect, Chennai 602105, Tamilnadu, India
Keywords
Encoding; Neural Machine Translation; Decoding; NLP; Natural Language Processing; MT
DOI
10.1007/s11042-023-17207-7
Chinese Library Classification
TP [Automation Technology; Computer Technology]
Discipline Code
0812
Abstract
Deep learning algorithms have made significant progress, and many model designs and methodologies have been tested to improve performance across the various fields of Natural Language Processing (NLP). NLP encompasses the domain of translation through the state-of-the-art process of machine translation. Deep learning refers to the use of neural networks with multiple layers to model complex patterns in data. In the context of Neural Machine Translation (NMT), deep learning models can capture the complex relationships between source and target languages, leading to more accurate and fluent translations. The encoder-decoder system is a framework for NMT that uses two neural networks, an encoder and a decoder, to translate input sequences into output sequences. The encoder network processes the input sequence and creates a fixed-length representation of it, while the decoder network generates the output sequence from that representation. Machine translation is the process by which a computer understands speech or text content and reproduces it in another language, approximating human translation without human intervention. Beyond being a prominent study area, numerous approaches have been established, including rule-based, statistical, and neural methods of machine translation. Among these, neural networks have achieved considerable advances. In this research, we review various strategies involved in encoding and decoding for the NMT scheme. Most NMT prototypes have been built on a sequential encoder-decoder framework that does not employ syntactic information.
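The encoder-decoder pattern described in the abstract can be sketched minimally as follows. This is an illustrative NumPy toy, not the paper's model: the weights are random (untrained), the dimensions, function names, and greedy decoding loop are all assumptions chosen for brevity. It shows only the structural point made above — the encoder compresses a variable-length input into one fixed-length vector, and the decoder generates output tokens from that vector alone.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sizes: source/target vocabularies, embedding and hidden dimensions.
# All values are illustrative, not taken from the paper.
vocab_src, vocab_tgt, d_emb, d_hid = 10, 12, 8, 16

# Randomly initialized parameters stand in for trained weights.
E_src = rng.normal(size=(vocab_src, d_emb)) * 0.1   # source embeddings
W_enc = rng.normal(size=(d_hid, d_hid + d_emb)) * 0.1  # encoder RNN weights
E_tgt = rng.normal(size=(vocab_tgt, d_emb)) * 0.1   # target embeddings
W_dec = rng.normal(size=(d_hid, d_hid + d_emb)) * 0.1  # decoder RNN weights
W_out = rng.normal(size=(vocab_tgt, d_hid)) * 0.1   # hidden -> vocab logits

def encode(src_ids):
    """Fold the whole source sequence into one fixed-length context vector."""
    h = np.zeros(d_hid)
    for tok in src_ids:
        # Simple (Elman-style) recurrence over [previous state; token embedding].
        h = np.tanh(W_enc @ np.concatenate([h, E_src[tok]]))
    return h  # same length (d_hid) regardless of how long src_ids was

def decode(context, max_len=5, bos=0):
    """Greedily generate target tokens from the encoder's representation."""
    h, tok, out = context, bos, []
    for _ in range(max_len):
        h = np.tanh(W_dec @ np.concatenate([h, E_tgt[tok]]))
        tok = int(np.argmax(W_out @ h))  # pick the highest-scoring next token
        out.append(tok)
    return out

context = encode([3, 1, 4, 1, 5])   # 5-token source sentence
translation = decode(context)       # 5 target token ids
```

Note that the entire source sentence must pass through the single `context` vector — the bottleneck that attention mechanisms (e.g., Bahdanau et al.) were later introduced to relax.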
Pages: 39335-39348
Page count: 14