Quantum space-efficient large language models for Prolog query translation

Cited by: 0
Authors
Ahmed, Roshan [1 ]
Sridevi, S. [2 ]
Affiliations
[1] Vellore Inst Technol, Sch Comp Sci & Engn, Dept AI & Robot, Chennai 600127, Tamil Nadu, India
[2] Vellore Inst Technol, Sch Comp Sci & Engn, Chennai 600127, Tamil Nadu, India
Keywords
Word2Vec; Large language model; Generative AI; Quantum computing; Quantum machine learning; Transfer learning; Prolog;
DOI
10.1007/s11128-024-04559-8
Chinese Library Classification
O4 [Physics];
Discipline Code
0702;
Abstract
As large language models (LLMs) continue to grow in complexity, their size increases exponentially, in line with Moore's law. Deploying such models poses a significant challenge, as classical computers may lack the space to run or store the model parameters. In this context, leveraging the principles of hybrid quantum machine learning for language models offers a promising way to mitigate this issue by reducing the storage required for model parameters. Although purely quantum language models have demonstrated success in the recent past, they are constrained by limited features and availability. In this research we propose the DeepKet model, an approach with a quantum embedding layer that uses the Hilbert space generated by quantum entanglement to store feature vectors, yielding a significant reduction in size. The experimental analysis evaluates the performance of open-source pre-trained models such as Microsoft Phi and CodeGen, fine-tuned to generate Prolog code for geospatial data retrieval, and investigates the effectiveness of quantum DeepKet embedding layers by comparing them against the total parameter count of traditional models.
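The storage saving behind a quantum embedding layer can be illustrated with amplitude encoding: a register of n qubits spans a 2^n-dimensional Hilbert space, so a d-dimensional feature vector can be stored in only ceil(log2(d)) qubits. The sketch below is a minimal NumPy illustration of this general principle, not the DeepKet implementation itself (the function name and dimensions are illustrative assumptions):

```python
import numpy as np

def amplitude_encode(vec):
    """Encode a real feature vector as the amplitudes of a quantum state.

    The d-dimensional vector is zero-padded to the next power of two and
    L2-normalized, so it fits in ceil(log2(d)) qubits.
    """
    d = len(vec)
    n_qubits = int(np.ceil(np.log2(d)))
    padded = np.zeros(2 ** n_qubits)
    padded[:d] = vec
    state = padded / np.linalg.norm(padded)  # valid quantum state: unit norm
    return state, n_qubits

# A 300-dimensional Word2Vec-style embedding fits in 9 qubits (2^9 = 512),
# versus 300 classical parameters per vocabulary entry.
embedding = np.random.default_rng(0).standard_normal(300)
state, n_qubits = amplitude_encode(embedding)
print(n_qubits)                       # 9
print(np.isclose(state @ state, 1.0))  # True: amplitudes form a unit vector
```

In a trainable quantum embedding layer the stored parameters are the rotation angles of a circuit acting on those few qubits, which is where the parameter-count reduction over a classical vocabulary-by-dimension embedding table comes from.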
Pages: 20
Related Papers
23 records in total
  • [1] Abdin Marah, 2023, Microsoft Research Blog
  • [2] Quantum Circuits for Fixed Matching Substring Problems
    Cantone, Domenico
    Faro, Simone
    Pavone, Arianna
    Viola, Caterina
    [J]. INTELLIGENT COMPUTING, VOL 3, 2024, 1018 : 667 - 686
  • [3] Quantum Language Model With Entanglement Embedding for Question Answering
    Chen, Yiwei
    Pan, Yu
    Dong, Daoyi
    [J]. IEEE TRANSACTIONS ON CYBERNETICS, 2023, 53 (06) : 3467 - 3478
  • [4] Matrix product states and projected entangled pair states: Concepts, symmetries, theorems
    Cirac, J. Ignacio
    Perez-Garcia, David
    Schuch, Norbert
    Verstraete, Frank
    [J]. REVIEWS OF MODERN PHYSICS, 2021, 93 (04)
  • [5] AN INTRODUCTION TO PROLOG-III
    COLMERAUER, A
    [J]. COMMUNICATIONS OF THE ACM, 1990, 33 (07) : 69 - 90
  • [6] Jones C. B., 2002, Proceedings of SIGIR 2002. Twenty-Fifth Annual International ACM SIGIR Conference on Research and Development in Information Retrieval, P387
  • [7] Big Code Search: A Bibliography
    Kim, Kisub
    Ghatpande, Sankalp
    Kim, Dongsun
    Zhou, Xin
    Liu, Kui
    Bissyande, Tegawende F.
    Klein, Jacques
    Le Traon, Yves
    [J]. ACM COMPUTING SURVEYS, 2024, 56 (01)
  • [8] Lieberman M.D., 2007, P 15 ANN ACM INT S A, P1
  • [9] Liu Y., 2023, Meta-Radiology, V1, DOI 10.1016/j.metrad.2023.100017
  • [10] Nijkamp Erik, 2023, Codegen: An open large language model for code with multi-turn program synthesis