Centralized Database Access: Transformer Framework and LLM/Chatbot Integration-Based Hybrid Model

Cited by: 3
Authors
Bratic, Diana [1 ]
Sapina, Marko [1 ]
Jurecic, Denis [1 ]
Grsic, Jana Ziljak [2 ]
Affiliations
[1] Univ Zagreb, Fac Graph Arts, Getaldiceva 2, Zagreb 10000, Croatia
[2] Zagreb Univ Appl Sci, Dept Informat & Comp, Zagreb 10000, Croatia
Keywords
centralized database; educational materials; transformer framework; NLP; API implementation; LLM/chatbot;
DOI
10.3390/asi7010017
CLC Classification
TP [Automation Technology, Computer Technology];
Subject Classification
0812;
Abstract
This paper addresses the challenges associated with the centralized storage of educational materials in the context of a fragmented and disparate database. In response to the increasing demands of modern education, efficient and accessible retrieval of materials for educators and students is essential. This paper presents a hybrid model based on the transformer framework and utilizing an API for an existing large language model (LLM)/chatbot. This integration ensures precise responses drawn from a comprehensive educational materials database. The model architecture uses mathematically defined algorithms for precise functions that enable deep text processing through advanced word embedding methods. This approach improves accuracy in natural language processing and ensures both high efficiency and adaptability. Therefore, this paper not only provides a technical solution to a prevalent problem but also highlights the potential for the continued development and integration of emerging technologies in education. The aim is to create a more efficient, transparent, and accessible educational environment. The importance of this research lies in its ability to streamline material access, benefiting the global scientific community and contributing to the continuous advancement of educational technology.
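The retrieve-then-generate flow the abstract describes — embed a query, match it against a centralized materials database, and pass the best match as context to an external LLM/chatbot API — can be sketched roughly as below. This is a minimal illustration, not the authors' implementation: the toy bag-of-words embedding stands in for the advanced word-embedding methods the paper uses, and the `retrieve`/`build_prompt` helpers and sample file names are hypothetical.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy bag-of-words "embedding"; a stand-in for the paper's
    # transformer-based word-embedding methods.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, materials: dict) -> str:
    # Return the key of the material most similar to the query.
    q = embed(query)
    return max(materials, key=lambda k: cosine(q, embed(materials[k])))

def build_prompt(query: str, context: str) -> str:
    # The retrieved material would be sent to the external
    # LLM/chatbot API together with the user's question.
    return f"Context: {context}\nQuestion: {query}"

# Hypothetical centralized store of educational materials.
materials = {
    "intro_nlp.pdf": "introduction to natural language processing and tokenization",
    "linear_algebra.pdf": "vectors matrices and linear transformations",
}

best = retrieve("how does tokenization work in NLP", materials)
prompt = build_prompt("how does tokenization work in NLP", materials[best])
```

In a full system, `build_prompt`'s output would be submitted to the chosen LLM/chatbot API, so the answer is grounded in the retrieved material rather than the model's parametric knowledge alone.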
Pages: 27