Efficient Parameterization for Knowledge Graph Embedding Using Hierarchical Attention Network

Cited by: 0
Authors
Chen, Zhen-Yu [1 ]
Liu, Feng-Chi [2 ]
Wang, Xin [3 ]
Lee, Cheng-Hsiung [1 ]
Lin, Ching-Sheng [1 ]
Affiliations
[1] Tunghai Univ, Master Program Digital Innovat, Taichung 40704, Taiwan
[2] Feng Chia Univ, Dept Stat, Taichung 40724, Taiwan
[3] SUNY Albany, AI Plus Inst, Coll Integrated Hlth Sci, Albany, NY 12222 USA
Source
CMC-COMPUTERS MATERIALS & CONTINUA | 2025, Vol. 82, No. 03
Keywords
Knowledge graph embedding; parameter efficiency; representation learning; reserved entity and relation sets; hierarchical attention network
DOI
10.32604/cmc.2025.061661
CLC Number
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
In the domain of knowledge graph embedding, conventional approaches typically transform entities and relations into continuous vector spaces. However, parameter efficiency becomes increasingly crucial when dealing with large-scale knowledge graphs that contain vast numbers of entities and relations. In particular, resource-intensive embeddings often lead to increased computational costs and may limit scalability and adaptability in practical environments, such as low-resource settings or real-world applications. This paper explores an approach to knowledge graph representation learning that leverages small reserved entity and relation sets for parameter-efficient embedding. We introduce a hierarchical attention network designed to refine and maximize the representational quality of embeddings by selectively focusing on these reserved sets, thereby reducing model complexity. Empirical assessments validate that our model achieves high performance on the benchmark dataset with fewer parameters and smaller embedding dimensions. The ablation studies further highlight the impact and contribution of each component in the proposed hierarchical attention structure.
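The abstract does not spell out the architecture, but the core reserved-set idea can be illustrated with a minimal sketch: instead of storing one full d-dimensional vector per entity, each entity is expressed as an attention-weighted mixture over a small shared codebook of reserved vectors, so the dominant parameter cost scales with the reserved-set size k rather than the full embedding dimension. All names below (e.g. `ReservedSetEmbedding`) are hypothetical, and the single-level attention shown is a simplification of the paper's hierarchical attention network.

```python
import numpy as np


def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)


class ReservedSetEmbedding:
    """Parameter-efficient entity embeddings (illustrative sketch).

    Each entity stores only k mixing logits over a shared k x d codebook
    of reserved vectors, so total parameters are n*k + k*d instead of the
    n*d required by a conventional lookup table (a saving when k << d).
    """

    def __init__(self, n_entities, n_reserved, dim, seed=0):
        rng = np.random.default_rng(seed)
        # Small shared codebook of reserved vectors (k x d).
        self.reserved = rng.normal(size=(n_reserved, dim))
        # Per-entity attention logits over the reserved set (n x k).
        self.mix_logits = rng.normal(size=(n_entities, n_reserved))

    def n_params(self):
        return self.reserved.size + self.mix_logits.size

    def embed(self, idx):
        # Attention weights over the reserved set, then a weighted mixture.
        weights = softmax(self.mix_logits[idx])
        return weights @ self.reserved
```

With n=10,000 entities, k=32 reserved vectors, and d=256, this stores 10,000*32 + 32*256 = 328,192 parameters versus 2,560,000 for a full lookup table, roughly an 8x reduction; in the paper itself the mixture is refined by the proposed hierarchical attention structure rather than a single softmax.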
Pages: 4287-4300 (14 pages)