RMAN: Relational multi-head attention neural network for joint extraction of entities and relations

Cited by: 25
Authors
Lai, Taiqu [1 ]
Cheng, Lianglun [2 ]
Wang, Depei [1 ]
Ye, Haiming [2 ]
Zhang, Weiwen [2 ]
Affiliations
[1] Guangdong Univ Technol, Sch Automat, Guangzhou, Peoples R China
[2] Guangdong Univ Technol, Sch Comp, Guangzhou, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Joint extraction of entities and relations; Relation feature; Multi-head attention; Sequence annotation; INFORMATION;
DOI
10.1007/s10489-021-02600-2
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The task of extracting entities and relations has evolved from distributed extraction to joint extraction. Joint models overcome the disadvantages of distributed extraction and strengthen the information interaction between entities and relations. However, existing joint models rarely pay attention to the semantic information between words, which limits their ability to handle overlapping relations. In this paper, we propose the RMAN model for joint extraction of entities and relations, which consists of an encoder that fuses multiple features into a sentence representation and a decoder that performs sequence annotation. We first add a multi-head attention layer after a Bi-LSTM to obtain sentence representations, and leverage the attention mechanism to capture relation-aware sentence representations. We then perform sequence annotation on these representations to obtain entity pairs. Experiments on the NYT-single, NYT-multi and WebNLG datasets demonstrate that our model extracts overlapping triples efficiently and outperforms other baselines.
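To make the pipeline described in the abstract concrete, the following is a minimal PyTorch sketch of an encoder-decoder of that shape: a Bi-LSTM followed by a multi-head attention layer, with a per-relation sequence-tagging head so that overlapping triples with different relation types can coexist. All layer sizes, the number of relation types, the tag set, and the class name RMANSketch are illustrative assumptions; this is not the authors' implementation.

```python
# Minimal sketch, assuming arbitrary hyperparameters and a BIES-style tag set;
# not the authors' code or released model.
import torch
import torch.nn as nn

class RMANSketch(nn.Module):
    def __init__(self, vocab_size, emb_dim=100, hidden_dim=128,
                 num_heads=4, num_relations=24, num_tags=9):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Bi-LSTM encoder over word embeddings.
        self.bilstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True,
                              bidirectional=True)
        # Multi-head self-attention over the Bi-LSTM outputs to capture
        # relation-aware sentence representations.
        self.attn = nn.MultiheadAttention(2 * hidden_dim, num_heads,
                                          batch_first=True)
        # One tag distribution per token and per candidate relation type,
        # so entities participating in overlapping relations can be tagged
        # independently under each relation.
        self.tagger = nn.Linear(2 * hidden_dim, num_relations * num_tags)
        self.num_relations = num_relations
        self.num_tags = num_tags

    def forward(self, token_ids):
        x = self.embed(token_ids)        # (B, T, emb_dim)
        h, _ = self.bilstm(x)            # (B, T, 2*hidden_dim)
        a, _ = self.attn(h, h, h)        # attention-refined representation
        logits = self.tagger(a)          # (B, T, num_relations*num_tags)
        B, T, _ = logits.shape
        return logits.view(B, T, self.num_relations, self.num_tags)

# Usage: per-token tag scores for each relation type; decoding entity pairs
# from the predicted tag sequences is omitted for brevity.
model = RMANSketch(vocab_size=30000)
scores = model(torch.randint(0, 30000, (2, 12)))  # shape (2, 12, 24, 9)
```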
Pages: 3132-3142
Page count: 11