Local-to-global GCN with knowledge-aware representation for distantly supervised relation extraction

Cited by: 28
Authors
Huang, Wenti [1 ,4 ]
Mao, Yiyu [5 ]
Yang, Liu [2 ]
Yang, Zhan [3 ]
Long, Jun [4 ]
Affiliations
[1] Hunan Univ Sci & Technol, Sch Comp Sci & Engn, Xiangtan 411100, Peoples R China
[2] Cent South Univ, Sch Comp Sci & Engn, Changsha 410083, Peoples R China
[3] Cent South Univ, Big Data Inst, Changsha 410083, Peoples R China
[4] Cent South Univ, Network Resources Management & Trust Evaluat Key, Changsha 410083, Peoples R China
[5] Cent South Univ, Sch Automat, Changsha 410083, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Relation extraction; Knowledge graph; Self-attention; Graph convolutional network;
DOI
10.1016/j.knosys.2021.107565
CLC classification
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Distantly supervised relation extraction is a prevalent technique for identifying semantic relations between two entities. Most prior models cannot effectively distinguish local from global information in the long-range dependencies among words. The latent semantic information in existing knowledge graphs is neglected entirely: knowledge-graph information is used merely as a label specifying the relation class rather than being treated as a graph. Moreover, previous studies only applied a selective attention mechanism over sentences to alleviate the impact of noise; they did not consider the implicit interactions between sentences in a sentence bag. In this paper, (1) we propose a knowledge-aware framework that enhances word representations by highlighting key words and relation clues; (2) we adopt a piecewise self-attention mechanism to model long-range dependencies among words; (3) we design a heterogeneous graph structure for each sentence bag and introduce a heterogeneous graph convolutional network with knowledge attention that aggregates the implicit interactions among sentences from a local-to-global perspective. Experimental results on two popular relation extraction datasets demonstrate that our model obtains more discriminative relation representations and outperforms most state-of-the-art models. (C) 2021 Elsevier B.V. All rights reserved.
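The two building blocks named in the abstract, self-attention over words and graph-convolutional aggregation over a sentence bag, can be illustrated with a minimal numpy sketch. This is a generic, hedged illustration of the standard techniques only, not the paper's actual model: the piecewise segmentation, knowledge attention, and heterogeneous node types are omitted, and all array sizes and names here are invented for the example.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X):
    # Scaled dot-product self-attention over word vectors X: (n_words, d).
    # Each word is re-expressed as a weighted sum of all words, capturing
    # long-range dependencies in one step.
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)        # pairwise word affinities
    return softmax(scores, axis=-1) @ X  # context-enriched word vectors

def gcn_layer(A, H, W):
    # One GCN layer: add self-loops, symmetrically normalize the adjacency,
    # then apply a linear map and ReLU. H: (n_nodes, d), W: (d, d_out).
    A_hat = A + np.eye(A.shape[0])
    d_inv = 1.0 / np.sqrt(A_hat.sum(axis=1))
    A_norm = A_hat * d_inv[:, None] * d_inv[None, :]
    return np.maximum(A_norm @ H @ W, 0.0)

rng = np.random.default_rng(0)

# Local step: encode one sentence of 6 toy "word" vectors, then mean-pool.
words = rng.standard_normal((6, 8))
encoded = self_attention(words)
sentence_vec = encoded.mean(axis=0)

# Global step: a bag of 4 sentence nodes on a fully connected graph,
# aggregated by one GCN layer into bag-aware sentence representations.
H = rng.standard_normal((4, 8))
A = np.ones((4, 4)) - np.eye(4)
W = rng.standard_normal((8, 8)) * 0.1
bag = gcn_layer(A, H, W)
print(encoded.shape, bag.shape)
```

In the paper's framing, the attention step supplies the "local" word-level view and the graph convolution over the bag supplies the "global" view; the real model replaces this toy adjacency with a heterogeneous bag graph and weights the aggregation with knowledge attention.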
Pages: 11