Efficient Parameterization for Knowledge Graph Embedding Using Hierarchical Attention Network

Cited: 0
Authors
Chen, Zhen-Yu [1 ]
Liu, Feng-Chi [2 ]
Wang, Xin [3 ]
Lee, Cheng-Hsiung [1 ]
Lin, Ching-Sheng [1 ]
Affiliations
[1] Tunghai Univ, Master Program Digital Innovat, Taichung 40704, Taiwan
[2] Feng Chia Univ, Dept Stat, Taichung 40724, Taiwan
[3] SUNY Albany, AI Plus Inst, Coll Integrated Hlth Sci, Albany, NY 12222 USA
Source
CMC-COMPUTERS MATERIALS & CONTINUA | 2025, Vol. 82, No. 03
Keywords
Knowledge graph embedding; parameter efficiency; representation learning; reserved entity and relation sets; hierarchical attention network
DOI
10.32604/cmc.2025.061661
CLC number
TP [Automation technology, computer technology]
Discipline classification code
0812
Abstract
In the domain of knowledge graph embedding, conventional approaches typically transform entities and relations into continuous vector spaces. However, parameter efficiency becomes increasingly crucial when dealing with large-scale knowledge graphs that contain vast numbers of entities and relations. In particular, resource-intensive embeddings often lead to increased computational costs and may limit scalability and adaptability in practical environments, such as low-resource settings or real-world applications. This paper explores an approach to knowledge graph representation learning that leverages small reserved entity and relation sets for parameter-efficient embedding. We introduce a hierarchical attention network designed to refine and maximize the representational quality of embeddings by selectively focusing on these reserved sets, thereby reducing model complexity. Empirical evaluations show that our model achieves high performance on the benchmark dataset with fewer parameters and smaller embedding dimensions. Ablation studies further highlight the impact and contribution of each component of the proposed hierarchical attention structure.
Pages: 4287-4300
Number of pages: 14
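The abstract above describes composing embeddings from small reserved entity and relation sets through a hierarchical attention network, but this record does not reproduce the model details. The sketch below is only a minimal illustration of the general idea of parameter-efficient embedding via a reserved set: each entity attends over a small shared table of reserved vectors and is represented as the attention-weighted combination, so the per-entity parameter cost stays small. The class name ReservedSetEmbedding, the query/key projection, and all dimensions are hypothetical choices for illustration, not the authors' implementation, which additionally uses a hierarchical attention structure.

# Minimal PyTorch sketch (illustrative assumption, not the authors' code):
# entity vectors are composed from a small reserved embedding table via attention.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ReservedSetEmbedding(nn.Module):
    def __init__(self, num_entities, num_reserved, dim, query_dim=8):
        super().__init__()
        # Shared reserved vectors: the only full-dimensional table in the model.
        self.reserved = nn.Parameter(torch.randn(num_reserved, dim) * 0.02)
        # Per-entity cost is just a small query vector (query_dim << dim).
        self.query = nn.Embedding(num_entities, query_dim)
        # Projects reserved vectors into the query space for scoring.
        self.key = nn.Linear(dim, query_dim, bias=False)

    def forward(self, entity_ids):
        q = self.query(entity_ids)                  # (batch, query_dim)
        k = self.key(self.reserved)                 # (num_reserved, query_dim)
        scores = q @ k.t() / q.size(-1) ** 0.5      # scaled dot-product scores
        weights = F.softmax(scores, dim=-1)         # attention over the reserved set
        return weights @ self.reserved              # (batch, dim) composed embeddings

# Example: 10,000 entities represented through 64 reserved vectors of size 200.
emb = ReservedSetEmbedding(num_entities=10_000, num_reserved=64, dim=200)
print(emb(torch.tensor([0, 5, 42])).shape)          # torch.Size([3, 200])

Relation embeddings could be formed the same way from a reserved relation set, and the composed vectors could then feed any standard scoring function (e.g., TransE or DistMult); these choices are assumptions, not details taken from the paper.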