A review on synergizing knowledge graphs and large language models

Cited by: 0
Authors
Yang, Zhenyao [1]
Yuan, Sha [1]
Shao, Zhou [2]
Li, Wenfa [1]
Liu, Runzhou [3]
Affiliations
[1] Univ Sci & Technol Beijing, Beijing, Peoples R China
[2] Zhipu AI, Beijing, Peoples R China
[3] Columbia Univ, New York, NY USA
Keywords
Large language models (LLMs); Knowledge graphs (KGs); Knowledge generation; GLM
DOI
10.1007/s00607-025-01499-8
CLC Number
TP301 [Theory and Methods];
Subject Classification Code
081202;
Abstract
This paper examines the integration of large language models (LLMs) with knowledge graphs (KGs) through a systematic four-layer framework spanning data, model, technology, and application dimensions. We analyze the capabilities and limitations of LLMs in natural language processing alongside the strengths and challenges of KGs in knowledge representation, address the fundamental weaknesses of each approach, and identify complementary integration methods. Our analysis reveals that LLMs excel at contextual understanding and generation but struggle with factual consistency and reasoning transparency, whereas KGs provide structured, verifiable knowledge but lack adaptability to unstructured inputs. We review integration strategies, including knowledge injection techniques, retrieval-augmented generation, and neuro-symbolic approaches, and find that the combined methods deliver significant performance improvements. Through a case study of the GLM architecture, we demonstrate how integrating KGs and LLMs improves accuracy, interpretability, and factual grounding in specialized domains, yielding substantial gains on knowledge-intensive tasks (15-20% on MedQA and 14-17% on medical MMLU benchmarks). The resulting hybrid systems offer concrete advantages in critical applications requiring both precision and adaptability, including healthcare diagnostics, financial compliance, and educational technology. Lightweight knowledge representation, adaptive update mechanisms, and unified cross-modal frameworks are promising research directions for advancing KG-LLM integration.
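To make the retrieval-augmented generation strategy mentioned in the abstract concrete, below is a minimal, self-contained sketch of KG-grounded prompting: triples retrieved from a toy knowledge graph are injected into the prompt before the question is handed to an LLM. All names here (the toy triple store, the keyword retriever, the prompt template) are illustrative assumptions and not components of the GLM-based system described in the paper; the actual LLM call is left abstract.

```python
# Illustrative sketch of KG-backed retrieval-augmented generation (RAG).
# The triple store, retriever, and prompt format are hypothetical examples,
# not the paper's implementation; the LLM call itself is omitted.

from typing import List, Tuple

Triple = Tuple[str, str, str]  # (subject, relation, object)

# A toy in-memory knowledge graph of medical facts.
KG: List[Triple] = [
    ("metformin", "treats", "type 2 diabetes"),
    ("metformin", "contraindicated_with", "severe renal impairment"),
    ("type 2 diabetes", "is_a", "metabolic disorder"),
]

def retrieve_triples(question: str, kg: List[Triple], k: int = 3) -> List[Triple]:
    """Naive keyword retrieval: keep triples whose subject or object
    appears in the question, ranked by how many of the two match."""
    q = question.lower()
    scored = [(sum(term in q for term in (s, o)), (s, r, o)) for s, r, o in kg]
    return [t for score, t in sorted(scored, reverse=True) if score > 0][:k]

def build_prompt(question: str, triples: List[Triple]) -> str:
    """Inject retrieved facts into the prompt so the model's answer is
    grounded in verifiable KG statements rather than parametric memory."""
    facts = "\n".join(f"- {s} {r.replace('_', ' ')} {o}" for s, r, o in triples)
    return (f"Known facts:\n{facts}\n\n"
            f"Question: {question}\n"
            f"Answer using only the facts above.")

if __name__ == "__main__":
    question = "Can metformin be used to treat type 2 diabetes?"
    prompt = build_prompt(question, retrieve_triples(question, KG))
    print(prompt)  # In a full system, this prompt would be sent to an LLM.
```

In a realistic pipeline the keyword match would be replaced by entity linking plus graph traversal or embedding-based retrieval, but the grounding step itself (retrieved facts first, question second) is the core of the KG-RAG pattern the review surveys.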
Pages: 25