Local-to-global GCN with knowledge-aware representation for distantly supervised relation extraction

Cited by: 28
Authors
Huang, Wenti [1 ,4 ]
Mao, Yiyu [5 ]
Yang, Liu [2 ]
Yang, Zhan [3 ]
Long, Jun [4 ]
Affiliations
[1] Hunan Univ Sci & Technol, Sch Comp Sci & Engn, Xiangtan 411100, Peoples R China
[2] Cent South Univ, Sch Comp Sci & Engn, Changsha 410083, Peoples R China
[3] Cent South Univ, Big Data Inst, Changsha 410083, Peoples R China
[4] Cent South Univ, Network Resources Management & Trust Evaluat Key, Changsha 410083, Peoples R China
[5] Cent South Univ, Sch Automat, Changsha 410083, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Relation extraction; Knowledge graph; Self-attention; Graph convolutional network;
DOI
10.1016/j.knosys.2021.107565
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Distantly supervised relation extraction is a prevalent technique for identifying semantic relations between two entities. Most prior models cannot effectively distinguish local from global information in the long-range dependencies among words. Furthermore, the latent semantic information present in existing knowledge graphs is neglected: the knowledge graph is used merely as a label that specifies the relation class, rather than being treated as a graph. Moreover, previous studies only applied a selective attention mechanism over sentences to alleviate the impact of noise; they did not consider the implicit interaction between sentences in a sentence bag. In this paper, (1) we propose a knowledge-aware framework to enhance word representations, which highlights the importance of key words and relation clues; (2) we adopt a piecewise self-attention mechanism to model long-range dependencies among words; (3) we design a heterogeneous graph structure for each sentence bag and introduce a heterogeneous graph convolutional network with knowledge attention to aggregate the implicit interactions among sentences from a local-to-global perspective. Experimental results on two popular relation extraction datasets demonstrate that our model obtains more discriminative relation representations and outperforms most state-of-the-art models. (C) 2021 Elsevier B.V. All rights reserved.
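The heterogeneous GCN mentioned in point (3) aggregates information by message passing over a bag-level graph. As a rough, generic illustration only (not the paper's actual architecture), a single graph-convolution layer with symmetric normalization can be sketched as below; the toy adjacency matrix, feature dimensions, and ReLU activation are illustrative assumptions:

```python
import numpy as np

def gcn_layer(A, H, W):
    """One GCN layer: H' = ReLU(D^{-1/2} (A + I) D^{-1/2} H W).

    A: (n, n) adjacency matrix, H: (n, d_in) node features,
    W: (d_in, d_out) learnable weights.
    """
    A_hat = A + np.eye(A.shape[0])           # add self-loops
    d = A_hat.sum(axis=1)                    # node degrees
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))   # D^{-1/2}
    A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt # symmetric normalization
    return np.maximum(A_norm @ H @ W, 0.0)   # aggregate, project, ReLU

# Toy bag: 3 "sentence" nodes with 4-dim features, projected to 2 dims.
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
H = np.random.default_rng(0).normal(size=(3, 4))
W = np.random.default_rng(1).normal(size=(4, 2))
out = gcn_layer(A, H, W)
print(out.shape)  # (3, 2)
```

Stacking such layers lets each node's representation absorb increasingly global neighborhood context, which is the "local-to-global" intuition the abstract refers to.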
Pages: 11