RMAN: Relational multi-head attention neural network for joint extraction of entities and relations

Cited: 25
Authors
Lai, Taiqu [1 ]
Cheng, Lianglun [2 ]
Wang, Depei [1 ]
Ye, Haiming [2 ]
Zhang, Weiwen [2 ]
Affiliations
[1] Guangdong Univ Technol, Sch Automat, Guangzhou, Peoples R China
[2] Guangdong Univ Technol, Sch Comp, Guangzhou, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Joint extraction of entities and relations; Relation feature; Multi-head attention; Sequence annotation; INFORMATION;
DOI
10.1007/s10489-021-02600-2
Chinese Library Classification (CLC)
TP18 [Theory of artificial intelligence];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
The task of extracting entities and relations has evolved from distributed extraction to joint extraction. The joint model overcomes the disadvantages of the distributed extraction method and strengthens the information interaction between entities and relations. However, existing joint models rarely pay attention to the semantic information between words, which limits their ability to resolve overlapping relations. In this paper, we propose the RMAN model for joint extraction of entities and relations, which consists of a multi-feature fusion encoder for sentence representation and a decoder for sequence annotation. We first add a multi-head attention layer after a Bi-LSTM to obtain sentence representations, and leverage the attention mechanism to capture relation-based sentence representations. Then, we perform sequence annotation on these sentence representations to obtain entity pairs. Experiments on the NYT-single, NYT-multi and WebNLG datasets demonstrate that our model can efficiently extract overlapping triples and outperforms other baselines.
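The following is a minimal PyTorch sketch (not the authors' code) of the pipeline the abstract describes: Bi-LSTM encoding, multi-head attention for relation-aware sentence representations, and per-token sequence annotation to tag entity pairs. Layer sizes, the number of relation types, the tag set, and the use of learned per-relation queries are illustrative assumptions, not details taken from the paper.

import torch
import torch.nn as nn

class RMANSketch(nn.Module):
    def __init__(self, vocab_size=10000, emb_dim=100, hidden=128,
                 num_heads=4, num_relations=24, num_tags=9):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Bi-LSTM encoder: 2*hidden features per token
        self.bilstm = nn.LSTM(emb_dim, hidden, batch_first=True, bidirectional=True)
        # Multi-head self-attention over the Bi-LSTM outputs
        self.mha = nn.MultiheadAttention(2 * hidden, num_heads, batch_first=True)
        # One learned query per relation type, used to form relation-based
        # sentence representations (an assumed realization of the relation feature)
        self.rel_queries = nn.Parameter(torch.randn(num_relations, 2 * hidden))
        # Decoder: tags each token (e.g., BIO labels marking subject/object spans)
        self.tagger = nn.Linear(4 * hidden, num_tags)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len)
        x = self.embed(token_ids)
        h, _ = self.bilstm(x)                    # (batch, seq_len, 2*hidden)
        h, _ = self.mha(h, h, h)                 # contextualized token features
        batch, seq_len, dim = h.shape
        # Attend from each relation query to the token features
        q = self.rel_queries.unsqueeze(0).expand(batch, -1, -1)
        scores = torch.softmax(q @ h.transpose(1, 2) / dim ** 0.5, dim=-1)
        rel_repr = scores @ h                    # (batch, num_relations, 2*hidden)
        # Concatenate each relation's sentence representation to every token and
        # predict a tag sequence, yielding entity pairs per relation
        rel_exp = rel_repr.unsqueeze(2).expand(-1, -1, seq_len, -1)
        tok_exp = h.unsqueeze(1).expand(-1, rel_repr.size(1), -1, -1)
        logits = self.tagger(torch.cat([tok_exp, rel_exp], dim=-1))
        return logits                            # (batch, num_relations, seq_len, num_tags)

# Example usage with dummy token ids
model = RMANSketch()
out = model(torch.randint(0, 10000, (2, 16)))
print(out.shape)  # torch.Size([2, 24, 16, 9])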
Pages: 3132 - 3142
Page count: 11
Related Papers
50 records in total
  • [31] Improving Multi-head Attention with Capsule Networks
    Gu, Shuhao
    Feng, Yang
    NATURAL LANGUAGE PROCESSING AND CHINESE COMPUTING (NLPCC 2019), PT I, 2019, 11838 : 314 - 326
  • [32] Classification of Heads in Multi-head Attention Mechanisms
    Huang, Feihu
    Jiang, Min
    Liu, Fang
    Xu, Dian
    Fan, Zimeng
    Wang, Yonghao
    KNOWLEDGE SCIENCE, ENGINEERING AND MANAGEMENT, KSEM 2022, PT III, 2022, 13370 : 681 - 692
  • [33] Diversifying Multi-Head Attention in the Transformer Model
    Ampazis, Nicholas
    Sakketou, Flora
    MACHINE LEARNING AND KNOWLEDGE EXTRACTION, 2024, 6 (04): : 2618 - 2638
  • [35] A new interest extraction method based on multi-head attention mechanism for CTR prediction
    Yang, Haifeng
    Yao, Linjing
    Cai, Jianghui
    Wang, Yupeng
    Zhao, Xujun
    KNOWLEDGE AND INFORMATION SYSTEMS, 2023, 65 (08) : 3337 - 3352
  • [36] Multi-head Attention and Graph Convolutional Networks with Regularized Dropout for Biomedical Relation Extraction
    Huang, Mian
    Wang, Jian
    Lin, Hongfei
    Yang, Zhihao
    HEALTH INFORMATION PROCESSING, CHIP 2023, 2023, 1993 : 98 - 111
  • [37] Entity and relation collaborative extraction approach based on multi-head attention and gated mechanism
    Zhao, Wei
    Zhao, Shan
    Chen, Shuhui
    Weng, Tien-Hsiung
    Kang, WenJie
    CONNECTION SCIENCE, 2022, 34 (01) : 670 - 686
  • [38] Research on power generation prediction of hydropower in river basin based on multi-head attention graph convolutional neural network
    Chen, Zhiliang
    Wang, Juan
    Wei, Miao
    JOURNAL OF COMPUTATIONAL METHODS IN SCIENCES AND ENGINEERING, 2024, 24 (02) : 797 - 811
  • [39] Acoustic Scene Analysis with Multi-head Attention Networks
    Wang, Weimin
    Wang, Weiran
    Sun, Ming
    Wang, Chao
    INTERSPEECH 2020, 2020, : 1191 - 1195
  • [40] Enhancing Attention Models via Multi-head Collaboration
    Wang, Huadong
    Tu, Mei
    2020 INTERNATIONAL CONFERENCE ON ASIAN LANGUAGE PROCESSING (IALP 2020), 2020, : 19 - 23