Infusing Dependency Syntax Information into a Transformer Model for Document-Level Relation Extraction from Biomedical Literature

Citations: 0
Authors
Yang, Ming [1 ]
Zhang, Yijia [1 ]
Liu, Da [1 ]
Du, Wei [1 ]
Di, Yide [1 ]
Lin, Hongfei [2 ]
Affiliations
[1] Dalian Maritime Univ, Sch Informat Sci & Technol, Dalian 116024, Liaoning, Peoples R China
[2] Dalian Univ Technol, Sch Comp Sci & Technol, Dalian 116023, Liaoning, Peoples R China
Source
HEALTH INFORMATION PROCESSING, CHIP 2022 | 2023 / Vol. 1772
Keywords
Document-level relation extraction; Dependency syntax information; Transformer model; Attention mechanism; Biomedical literature;
DOI
10.1007/978-981-19-9865-2_3
CLC Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
In the biomedical domain, document-level relation extraction is a challenging task that offers a new and more effective approach to mining long and complex texts. Studies have shown that the Transformer models the dependency between any two tokens without regard to their syntax-level dependencies in the sequence. In this work, we propose a Dependency Syntax Transformer Model (DSTM) to improve the Transformer's ability to model long-range dependencies. We propose three methods for injecting dependency syntax information into the Transformer to strengthen the attention between tokens that are syntactically dependent within a sentence, thereby improving the Transformer's handling of long texts in document-level relation extraction. Experimental results on CDR, a document-level relation extraction dataset in the biomedical field, demonstrate the effectiveness of the DSTM model, and results on the general-domain dataset DocRED demonstrate its generality.
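The abstract describes biasing the Transformer's attention toward token pairs linked by a dependency arc. The paper's three specific injection methods are not detailed here, so the following is only a minimal illustrative sketch of one plausible variant: an additive bias on the raw attention scores at positions marked in a dependency adjacency matrix. The function name, the scalar `bias` parameter, and the toy inputs are all assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def dependency_biased_attention(q, k, v, dep_adj, bias=2.0):
    """Scaled dot-product attention with an additive bias on token
    pairs connected by a dependency arc (hypothetical sketch)."""
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)      # (n, n) raw attention scores
    scores = scores + bias * dep_adj   # boost syntactically linked pairs
    # row-wise softmax, shifted for numerical stability
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ v, weights

# Toy example: 3 tokens, with a dependency arc between tokens 0 and 2.
rng = np.random.default_rng(0)
q = k = v = rng.normal(size=(3, 4))
dep = np.zeros((3, 3))
dep[0, 2] = dep[2, 0] = 1.0
out, w = dependency_biased_attention(q, k, v, dep)
```

Because the bias is added before the softmax, the attention weight between the dependency-linked pair is strictly larger than it would be with `bias=0.0`, while each row still sums to one; other injection strategies (e.g., hard masking or gating by arc type) would follow the same pattern.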
Pages: 37-52
Page count: 16