A Review on Large Language Models: Architectures, Applications, Taxonomies, Open Issues and Challenges

Cited by: 185
Authors
Raiaan, Mohaimenul Azam Khan [1 ]
Mukta, Md. Saddam Hossain [2 ]
Fatema, Kaniz [3 ]
Fahad, Nur Mohammad [1 ]
Sakib, Sadman [1 ]
Mim, Most Marufatul Jannat [1 ]
Ahmad, Jubaer [1 ]
Ali, Mohammed Eunus [4 ]
Azam, Sami [3 ]
Affiliations
[1] United Int Univ, Dept Comp Sci & Engn, Dhaka 1212, Bangladesh
[2] Lappeenranta Lahti Univ Technol, LUT Sch Engn Sci, Lappeenranta 53850, Finland
[3] Charles Darwin Univ, Fac Sci & Technol, Casuarina, NT 0909, Australia
[4] Bangladesh Univ Engn & Technol BUET, Dept CSE, Dhaka 1000, Bangladesh
Keywords
Cognition; Artificial intelligence; Transformers; Training; Taxonomy; Task analysis; Surveys; Natural language processing; Question answering (information retrieval); Information analysis; Linguistics; Large language models (LLM); natural language processing (NLP); artificial intelligence; transformer; pre-trained models; taxonomy; application; GPT-4; bias
DOI
10.1109/ACCESS.2024.3365742
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
Large Language Models (LLMs) have recently demonstrated extraordinary capability in a variety of natural language processing (NLP) tasks, including language translation, text generation, and question answering. LLMs have also become a new and essential part of computerized language processing, able to understand complex verbal patterns and generate coherent, contextually appropriate replies. Although this success has prompted a substantial increase in research contributions, the rapid growth has made it difficult to grasp the overall impact of these advances. Because so much research on LLMs has appeared within such a short time, it is nearly impossible to track it all and form an overview of the current state of the field. The research community would therefore benefit from a concise yet thorough review of recent developments in this area. This article provides such an overview of LLMs, covering their history, architectures, transformers, resources, training methods, applications, impacts, and challenges. It begins with the fundamental concepts of LLMs and the traditional pipeline of the LLM training phase. It then surveys existing work: the history of LLMs and their evolution over time, the architecture of transformers in LLMs, the different resources for LLMs, and the different training methods that have been used to train them. The paper also describes the datasets used in the reviewed studies. It then discusses the wide range of LLM applications, including biomedical and healthcare, education, social, business, and agricultural domains, and illustrates how LLMs create an impact on society, shape the future of AI, and can be used to solve real-world problems. Finally, the paper explores the open issues and challenges of deploying LLMs in real-world scenarios.
Our review aims to help practitioners, researchers, and experts thoroughly understand the evolution of LLMs, pre-trained architectures, applications, challenges, and future goals.
Pages: 26839-26874 (36 pages)