Infusing Dependency Syntax Information into a Transformer Model for Document-Level Relation Extraction from Biomedical Literature

Cited by: 0
Authors
Yang, Ming [1 ]
Zhang, Yijia [1 ]
Liu, Da [1 ]
Du, Wei [1 ]
Di, Yide [1 ]
Lin, Hongfei [2 ]
Affiliations
[1] Dalian Maritime Univ, Sch Informat Sci & Technol, Dalian 116024, Liaoning, Peoples R China
[2] Dalian Univ Technol, Sch Comp Sci & Technol, Dalian 116023, Liaoning, Peoples R China
Source
HEALTH INFORMATION PROCESSING, CHIP 2022, 2023, Vol. 1772
Keywords
Document-level relation extraction; Dependency syntax information; Transformer model; Attention mechanism; Biomedical literature;
DOI
10.1007/978-981-19-9865-2_3
Chinese Library Classification (CLC): TP18 [Artificial Intelligence Theory]
Subject Classification Codes: 081104; 0812; 0835; 1405
Abstract
In the biomedical domain, document-level relation extraction is a challenging task that offers a more effective approach to mining long and complex texts. Studies have shown that the Transformer models dependencies between any two tokens without regard to their syntax-level dependency in the sequence. In this work, we propose a Dependency Syntax Transformer Model (DSTM) to improve the Transformer's ability to model long-range dependencies. Three methods are proposed for introducing dependency syntax information into the Transformer, strengthening the attention between tokens that are syntactically dependent within a sentence. The resulting model improves the Transformer's ability to handle long texts in document-level relation extraction. Experimental results on CDR, a document-level relation extraction dataset in the biomedical field, demonstrate the effectiveness of the DSTM model, and results on the general-domain dataset DocRED demonstrate its generality.
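The abstract does not detail the three injection methods. As a rough illustration of the general idea only, the sketch below shows one common way of infusing dependency syntax into self-attention: edges from a dependency parse become an additive bias on the attention scores, so syntactically linked tokens attend to each other more strongly. The function and parameter names (dependency_biased_attention, dep_edges, bias_strength) are illustrative assumptions and not necessarily any of the paper's proposed methods.

```python
# Minimal sketch: scaled dot-product attention with an additive
# dependency-syntax bias. Assumption-based; not the paper's exact formulation.
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def dependency_biased_attention(Q, K, V, dep_edges, bias_strength=1.0):
    """Attention where token pairs linked in the dependency parse get a score boost.

    Q, K, V       : (seq_len, d) query/key/value matrices
    dep_edges     : list of (head, dependent) index pairs from a dependency parse
    bias_strength : extra score added to syntactically linked token pairs
    """
    seq_len, d = Q.shape
    scores = Q @ K.T / np.sqrt(d)            # standard attention scores

    # Symmetric adjacency matrix built from the dependency tree.
    adj = np.zeros((seq_len, seq_len))
    for head, dep in dep_edges:
        adj[head, dep] = adj[dep, head] = 1.0

    scores = scores + bias_strength * adj    # boost syntactically related pairs
    weights = softmax(scores, axis=-1)
    return weights @ V, weights

# Toy usage: 5 tokens with random embeddings and a small dependency tree.
rng = np.random.default_rng(0)
Q = K = V = rng.normal(size=(5, 8))
edges = [(1, 0), (1, 3), (3, 2), (3, 4)]     # (head, dependent) pairs
output, attn = dependency_biased_attention(Q, K, V, edges)
print(attn.round(2))
```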
Pages: 37-52 (16 pages)