Efficient Parameterization for Knowledge Graph Embedding Using Hierarchical Attention Network

Cited by: 0
Authors
Chen, Zhen-Yu [1 ]
Liu, Feng-Chi [2 ]
Wang, Xin [3 ]
Lee, Cheng-Hsiung [1 ]
Lin, Ching-Sheng [1 ]
Affiliations
[1] Tunghai Univ, Master Program Digital Innovat, Taichung 40704, Taiwan
[2] Feng Chia Univ, Dept Stat, Taichung 40724, Taiwan
[3] SUNY Albany, AI Plus Inst, Coll Integrated Hlth Sci, Albany, NY 12222 USA
Source
CMC-COMPUTERS MATERIALS & CONTINUA | 2025, Vol. 82, No. 03
Keywords
Knowledge graph embedding; parameter efficiency; representation learning; reserved entity and relation sets; hierarchical attention network;
DOI
10.32604/cmc.2025.061661
CLC Classification
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
In knowledge graph embedding, conventional approaches map entities and relations into continuous vector spaces. However, parameter efficiency becomes crucial for large-scale knowledge graphs containing vast numbers of entities and relations: resource-intensive embeddings increase computational costs and can limit scalability and adaptability in practical environments, such as low-resource settings or real-world applications. This paper explores an approach to knowledge graph representation learning that leverages small reserved entity and relation sets for parameter-efficient embedding. We introduce a hierarchical attention network designed to refine and maximize the representational quality of embeddings by selectively attending over these reserved sets, thereby reducing model complexity. Empirical results show that our model achieves high performance on the benchmark dataset with fewer parameters and smaller embedding dimensions. Ablation studies further quantify the impact and contribution of each component of the proposed hierarchical attention structure.
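The core idea described in the abstract, representing each entity through a small reserved set rather than storing a full embedding row per entity, can be illustrated with a minimal NumPy sketch. This is a deliberate single-level simplification for intuition only: the paper's actual hierarchical attention network, its handling of relations, and all names and sizes below (`num_reserved`, `dim`, the logit parameterization) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)

num_entities = 10_000  # full entity vocabulary (illustrative size)
num_reserved = 50      # small reserved entity set (illustrative size)
dim = 200              # embedding dimension (illustrative size)

# Only the reserved set carries full dim-sized embedding rows.
reserved = rng.normal(size=(num_reserved, dim))

# Each ordinary entity stores only attention logits over the reserved set
# (num_reserved parameters per entity instead of a dim-sized embedding row).
entity_logits = rng.normal(size=(num_entities, num_reserved))

def embed(entity_ids):
    """Compose entity embeddings as attention-weighted mixtures of the reserved set."""
    weights = softmax(entity_logits[entity_ids], axis=-1)  # (batch, num_reserved)
    return weights @ reserved                              # (batch, dim)

emb = embed(np.array([0, 1, 2]))

full_params = num_entities * dim                                  # conventional lookup table
compact_params = num_entities * num_reserved + num_reserved * dim # reserved-set scheme
print(emb.shape, full_params, compact_params)  # (3, 200) 2000000 510000
```

With these illustrative sizes, the parameter count drops from 2,000,000 to 510,000 while every entity still receives a full 200-dimensional representation; the paper's hierarchical attention can be viewed as a more expressive, multi-level refinement of this composition step.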
Pages: 4287-4300
Page count: 14