AutoAlign: Fully Automatic and Effective Knowledge Graph Alignment Enabled by Large Language Models

Cited by: 8
Authors
Zhang, Rui [1 ]
Su, Yixin [2 ]
Trisedya, Bayu Distiawan [3 ]
Zhao, Xiaoyan [4 ]
Yang, Min [5 ]
Cheng, Hong [4 ]
Qi, Jianzhong [2 ]
Affiliations
[1] Tsinghua Univ, Beijing 100190, Peoples R China
[2] Univ Melbourne, Parkville, Vic 3052, Australia
[3] Univ Indonesia, Depok City 16424, West Java, Indonesia
[4] Chinese Univ Hong Kong, Hong Kong, Peoples R China
[5] Chinese Acad Sci, Shenzhen Inst Adv Technol, Shenzhen 518055, Peoples R China
Keywords
Learning systems; Knowledge graphs; Vectors; Data models; Task analysis; Attribute embeddings; deep learning; entity alignment; knowledge base; knowledge graph; knowledge graph alignment; large language model; predicate proximity graph; representation learning; WEB;
DOI
10.1109/TKDE.2023.3325484
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The task of entity alignment between knowledge graphs (KGs) aims to identify every pair of entities from two different KGs that represent the same real-world entity. Many machine learning-based methods have been proposed for this task. However, to the best of our knowledge, existing methods all require manually crafted seed alignments, which are expensive to obtain. In this paper, we propose the first fully automatic alignment method named AutoAlign, which does not require any manually crafted seed alignments. Specifically, for predicate embeddings, AutoAlign constructs a predicate proximity graph with the help of large language models to automatically capture the similarity between predicates across two KGs. For entity embeddings, AutoAlign first computes the entity embeddings of each KG independently using TransE, and then shifts the two KGs' entity embeddings into the same vector space by computing the similarity between entities based on their attributes. Thus, both predicate alignment and entity alignment can be done without manually crafted seed alignments. AutoAlign is not only fully automatic, but also highly effective. Experiments using real-world KGs show that AutoAlign improves the performance of entity alignment significantly compared to state-of-the-art methods.
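The abstract outlines a two-step pipeline: each KG's entity embeddings are trained independently with TransE, and the two embedding spaces are then pulled together using attribute-based entity similarity instead of seed alignments. The snippet below is a minimal, illustrative Python sketch of that general idea, not the paper's implementation; the function names (transe_score, attribute_similarity, shift_into_shared_space), the Jaccard-overlap stand-in for learned attribute embeddings, and the simple weighted-shift update are all assumptions made for illustration.

```python
import numpy as np

def transe_score(h, r, t, norm=1):
    """TransE plausibility of a triple (h, r, t): h + r should be close to t."""
    return np.linalg.norm(h + r - t, ord=norm)

def attribute_similarity(attrs_a, attrs_b):
    """Jaccard overlap of attribute-value tokens between two entities.

    A crude stand-in for learned attribute (character-level) embeddings;
    any similarity score in [0, 1] would play the same role here.
    """
    a, b = set(attrs_a), set(attrs_b)
    return len(a & b) / max(len(a | b), 1)

def shift_into_shared_space(emb_kg2, emb_kg1, attr_pairs, lr=0.1, epochs=50):
    """Nudge KG2 entity embeddings toward attribute-similar KG1 entities.

    attr_pairs: iterable of (entity_kg1, entity_kg2, similarity) triples;
    the similarity weights how strongly a pair pulls the embeddings together.
    """
    shifted = {e: v.copy() for e, v in emb_kg2.items()}
    for _ in range(epochs):
        for e1, e2, sim in attr_pairs:
            shifted[e2] += lr * sim * (emb_kg1[e1] - shifted[e2])
    return shifted

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy vectors standing in for two independently trained TransE models.
    emb_kg1 = {"Berlin@KG1": rng.normal(size=8)}
    emb_kg2 = {"Berlin@KG2": rng.normal(size=8)}
    sim = attribute_similarity(["germany", "capital", "3.6M"],
                               ["germany", "capital", "city"])
    shifted = shift_into_shared_space(emb_kg2, emb_kg1,
                                      [("Berlin@KG1", "Berlin@KG2", sim)])
    dist = np.linalg.norm(emb_kg1["Berlin@KG1"] - shifted["Berlin@KG2"])
    print(f"attribute similarity: {sim:.2f}, distance after shift: {dist:.3f}")
```

In this sketch only the second KG's embeddings are moved, so the first KG's space acts as the shared target space; how the actual method aligns the spaces and integrates the LLM-built predicate proximity graph is described in the full paper.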
Pages: 2357-2371
Page count: 15