Joint Entity-Relation Extraction via Improved Graph Attention Networks

Times Cited: 8
Authors
Lai, Qinghan [1 ]
Zhou, Zihan [1 ]
Liu, Song [1 ]
Affiliations
[1] Qilu Univ Technol, Shandong Acad Sci, Sch Comp Sci & Technol, Jinan 250353, Peoples R China
Source
SYMMETRY-BASEL, 2020, Vol. 12, Issue 10
Keywords
named entity recognition; relation extraction; graph attention network; adversarial training;
DOI
10.3390/sym12101746
Chinese Library Classification (CLC)
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences]
Discipline Classification Codes
07; 0710; 09
Abstract
Joint named entity recognition and relation extraction is an essential natural language processing task that aims to identify entities and extract the corresponding relations in an end-to-end manner. At present, compared with the named entity recognition task, the relation extraction task performs poorly on complex text. To solve this problem, we proposed a novel joint model named Extracting Entity-Relations via Improved Graph Attention Networks (ERIGAT), which enhances the performance of the relation extraction task. In our proposed model, we introduced a graph attention network to extract entities and relations after graph embedding based on constructing symmetry relations. To mitigate the over-smoothing problem of graph convolutional networks, and inspired by matrix factorization, we improved the graph attention network by designing a new multi-head attention mechanism and sharing attention parameters. To enhance model robustness, we adopted adversarial training, generating adversarial samples by adding tiny perturbations. We comprehensively evaluated our model against typical baseline models through experiments on an open-domain dataset (CoNLL04) and a medical-domain dataset (ADE). The experimental results demonstrate the effectiveness of ERIGAT in extracting entity and relation information.
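The abstract mentions two mechanisms that can be sketched concretely: a multi-head graph attention layer whose attention parameters are shared across heads, and adversarial training that perturbs embeddings with small gradient-based noise. The PyTorch snippet below is a minimal illustrative sketch of these two ideas under stated assumptions, not the authors' ERIGAT implementation; the names SharedHeadGAT and fgm_perturb, the default hyperparameters, and the self-loop assumption on the adjacency matrix are all introduced here for illustration.

```python
# Minimal sketch (assumed names, not the authors' ERIGAT code) of:
#  (1) a multi-head graph attention layer that shares one attention vector
#      across all heads, and
#  (2) an FGM-style adversarial perturbation of an embedding matrix.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SharedHeadGAT(nn.Module):
    """Graph attention layer whose attention parameters are shared by all heads."""

    def __init__(self, in_dim: int, out_dim: int, heads: int = 4):
        super().__init__()
        self.heads, self.out_dim = heads, out_dim
        # Per-head linear projection of node features.
        self.proj = nn.Linear(in_dim, heads * out_dim, bias=False)
        # Single attention vector reused by every head (parameter sharing).
        self.attn = nn.Parameter(torch.randn(2 * out_dim) * 0.02)

    def forward(self, h: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # h: (N, in_dim) node features; adj: (N, N) symmetric 0/1 adjacency,
        # assumed to include self-loops so every node has at least one neighbor.
        n = h.size(0)
        z = self.proj(h).view(n, self.heads, self.out_dim)              # (N, H, D)
        zi = z.unsqueeze(1).expand(n, n, self.heads, self.out_dim)      # target node i
        zj = z.unsqueeze(0).expand(n, n, self.heads, self.out_dim)      # neighbor node j
        scores = F.leaky_relu(torch.cat([zi, zj], dim=-1) @ self.attn,
                              negative_slope=0.2)                       # (N, N, H)
        scores = scores.masked_fill(adj.unsqueeze(-1) == 0, float("-inf"))
        alpha = torch.softmax(scores, dim=1)                            # normalize over neighbors j
        out = torch.einsum("ijh,jhd->ihd", alpha, z)                    # attention-weighted sum
        return out.mean(dim=1)                                          # average heads -> (N, D)


def fgm_perturb(embeddings: torch.Tensor, loss: torch.Tensor,
                epsilon: float = 1.0) -> torch.Tensor:
    """Return embeddings plus a small gradient-direction (FGM-style) perturbation.

    `embeddings` must require grad and belong to the graph that produced `loss`.
    """
    grad = torch.autograd.grad(loss, embeddings, retain_graph=True)[0]
    return embeddings + epsilon * grad / (grad.norm() + 1e-12)
```

In a typical training loop, the perturbed embeddings returned by fgm_perturb would be fed through the model again to produce an adversarial loss that is added to the clean loss before the optimizer step; the exact perturbation scheme used in ERIGAT may differ.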
Pages: 1-17
Number of Pages: 17
Related Papers
50 records in total
  • [31] Dual Interactive Attention Network for Joint Entity and Relation Extraction
    Li, Lishuang
    Wang, Zehao
    Qin, Xueyang
    Lu, Hongbin
    NATURAL LANGUAGE PROCESSING AND CHINESE COMPUTING, NLPCC 2022, PT I, 2022, 13551 : 259 - 271
  • [32] ASPER: Answer Set Programming Enhanced Neural Network Models for Joint Entity-Relation Extraction
    Le, Trung Hoang
    Cao, Huiping
    Cao Son, Tran
    THEORY AND PRACTICE OF LOGIC PROGRAMMING, 2023, 23 (04) : 765 - 781
  • [33] Joint Entity Relation Extraction Model Based on Interactive Attention
    Hao, Xiaofang
    Zhang, Chaoqun
    Li, Xiaoxiang
    Wang, Darui
    Computer Engineering and Applications, 2024, 60 (08) : 156 - 164
  • [34] Joint Entity and Relation Extraction With Set Prediction Networks
    Sui, Dianbo
    Zeng, Xiangrong
    Chen, Yubo
    Liu, Kang
    Zhao, Jun
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (09) : 12784 - 12795
  • [35] Entity-relation aggregation mechanism graph neural network for knowledge graph embedding
    Xu, Guoshun
    Rao, Guozheng
    Zhang, Li
    Cong, Qing
    APPLIED INTELLIGENCE, 2025, 55 (01)
  • [36] Entity-Relation Extraction as Multi-turn Question Answering
    Li, Xiaoya
    Yin, Fan
    Sun, Zijun
    Li, Xiayu
    Yuan, Arianna
    Chai, Duo
    Zhou, Mingxin
    Li, Jiwei
    57TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2019), 2019, : 1340 - 1350
  • [37] Entity-Relation Extraction as Full Shallow Semantic Dependency Parsing
    Jiang, Shu
    Li, Zuchao
    Zhao, Hai
    Ding, Weiping
    IEEE-ACM TRANSACTIONS ON AUDIO SPEECH AND LANGUAGE PROCESSING, 2024, 32 : 1088 - 1099
  • [38] Joint entity and relation extraction with position-aware attention and relation embedding
    Chen, Tiantian
    Zhou, Lianke
    Wang, Nianbin
    Chen, Xirui
    APPLIED SOFT COMPUTING, 2022, 119
  • [39] Attention Guided Graph Convolutional Networks for Relation Extraction
    Guo, Zhijiang
    Zhang, Yan
    Lu, Wei
    57TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2019), 2019, : 241 - 251
  • [40] A hybrid attention mechanism for multi-target entity relation extraction using graph neural networks
    Javeed, Arshad
    MACHINE LEARNING WITH APPLICATIONS, 2023, 11