Efficient Parameterization for Knowledge Graph Embedding Using Hierarchical Attention Network

Cited by: 0
Authors
Chen, Zhen-Yu [1 ]
Liu, Feng-Chi [2 ]
Wang, Xin [3 ]
Lee, Cheng-Hsiung [1 ]
Lin, Ching-Sheng [1 ]
Affiliations
[1] Tunghai Univ, Master Program Digital Innovat, Taichung 40704, Taiwan
[2] Feng Chia Univ, Dept Stat, Taichung 40724, Taiwan
[3] SUNY Albany, AI Plus Inst, Coll Integrated Hlth Sci, Albany, NY 12222 USA
Source
CMC-COMPUTERS MATERIALS & CONTINUA | 2025, Vol. 82, No. 3
Keywords
Knowledge graph embedding; parameter efficiency; representation learning; reserved entity and relation sets; hierarchical attention network;
DOI
10.32604/cmc.2025.061661
CLC Number
TP [Automation and Computer Technology];
Discipline Code
0812 ;
Abstract
In the domain of knowledge graph embedding, conventional approaches typically transform entities and relations into continuous vector spaces. However, parameter efficiency becomes increasingly crucial when dealing with large-scale knowledge graphs that contain vast numbers of entities and relations. In particular, resource-intensive embeddings often lead to increased computational costs and may limit scalability and adaptability in practical environments, such as low-resource settings or real-world applications. This paper explores an approach to knowledge graph representation learning that leverages small reserved entity and relation sets for parameter-efficient embedding. We introduce a hierarchical attention network designed to refine and maximize the representational quality of embeddings by selectively focusing on these reserved sets, thereby reducing model complexity. Empirical assessments validate that our model achieves high performance on the benchmark dataset with fewer parameters and smaller embedding dimensions. The ablation studies further highlight the impact and contribution of each component in the proposed hierarchical attention structure.
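The core idea in the abstract — composing entity embeddings from a small reserved set via attention, so the parameter count no longer scales with `n_entities × dim` — can be illustrated with a minimal NumPy sketch. This is not the authors' actual model: the reserved-set size, query dimension, and the single attention level here are illustrative assumptions (the paper's network is hierarchical and trained end to end), and all names (`reserved`, `keys`, `queries`, `embed`) are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)

n_entities, dim = 10_000, 64      # full table would need n_entities * dim params
k_reserved, d_query = 32, 8       # small reserved set + tiny per-entity codes

# Parameters: a shared reserved codebook plus a short query vector per
# entity, instead of a full (n_entities x dim) embedding table.
reserved = rng.normal(size=(k_reserved, dim))      # reserved entity set
keys     = rng.normal(size=(k_reserved, d_query))  # attention keys
queries  = rng.normal(size=(n_entities, d_query))  # per-entity query codes

def embed(entity_ids):
    """Compose embeddings by attending over the shared reserved set."""
    q = queries[entity_ids]                          # (batch, d_query)
    att = softmax(q @ keys.T / np.sqrt(d_query))     # (batch, k_reserved)
    return att @ reserved                            # (batch, dim)

full_params = n_entities * dim                       # 640,000
ours = k_reserved * (dim + d_query) + n_entities * d_query  # 82,304
```

Even in this single-level toy, the attention-composed parameterization uses roughly an eighth of the parameters of a dense embedding table; the paper's hierarchical attention applies the same principle to both entity and relation reserved sets.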
Pages: 4287-4300
Page count: 14