KGTuner: Efficient Hyper-parameter Search for Knowledge Graph Learning

Cited: 0
Authors
Zhang, Yongqi [1 ]
Zhou, Zhanke [1 ,2 ]
Yao, Quanming [3 ]
Li, Yong [3 ]
Affiliations
[1] 4Paradigm Inc, Beijing, Peoples R China
[2] Hong Kong Baptist Univ, Hong Kong, Peoples R China
[3] Tsinghua Univ, Dept Elect Engn, Beijing, Peoples R China
Source
PROCEEDINGS OF THE 60TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2022), VOL 1: (LONG PAPERS) | 2022
Keywords: (none listed)
DOI: (none listed)
CLC number: TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
While hyper-parameters (HPs) are important for knowledge graph (KG) learning, existing methods fail to search them efficiently. To solve this problem, we first analyze the properties of different HPs and measure how well they transfer from small subgraphs to the full graph. Based on this analysis, we propose an efficient two-stage search algorithm, KGTuner, which explores HP configurations on a small subgraph in the first stage and transfers the top-performing configurations to the large full graph for fine-tuning in the second stage. Experiments show that our method consistently finds better HPs than the baseline algorithms within the same time budget, achieving a 9.1% average relative improvement for four embedding models on the large-scale KGs in the open graph benchmark.
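The two-stage procedure described in the abstract can be sketched as follows. This is a minimal, hypothetical illustration, not the authors' implementation: the `evaluate` callback, the HP space, and the subgraph/full-graph handles are all placeholder assumptions, and stage 1 here uses plain random search for simplicity.

```python
import random

def two_stage_search(hp_space, evaluate, subgraph, full_graph,
                     n_explore=50, n_top=5, seed=0):
    """Sketch of a KGTuner-style two-stage HP search.

    Stage 1: cheaply score many random configurations on a small subgraph.
    Stage 2: re-evaluate only the top performers on the large full graph.
    """
    rng = random.Random(seed)

    # Stage 1: random exploration on the small subgraph (cheap evaluations).
    explored = []
    for _ in range(n_explore):
        config = {name: rng.choice(choices) for name, choices in hp_space.items()}
        explored.append((evaluate(config, subgraph), config))

    # Keep the n_top configurations with the best subgraph scores.
    explored.sort(key=lambda pair: pair[0], reverse=True)
    finalists = [config for _, config in explored[:n_top]]

    # Stage 2: fine-tune/evaluate the few finalists on the full graph (expensive).
    return max(finalists, key=lambda config: evaluate(config, full_graph))
```

The design choice this illustrates is the cost asymmetry: the subgraph filters out most of the search space cheaply, so only a handful of expensive full-graph evaluations are needed.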
Pages: 2715-2735
Page count: 21