Document-Level Relation Extraction with a Dependency Syntax Transformer and Supervised Contrastive Learning

Cited: 0
Authors
Yang, Ming [1 ]
Zhang, Yijia [1 ]
Banbhrani, Santosh Kumar [2 ]
Lin, Hongfei [2 ]
Lu, Mingyu [1 ]
Affiliations
[1] Dalian Maritime Univ, Sch Informat Sci & Technol, Dalian 116024, Liaoning, Peoples R China
[2] Dalian Univ Technol, Sch Comp Sci & Technol, Dalian 116023, Liaoning, Peoples R China
Source
KNOWLEDGE GRAPH AND SEMANTIC COMPUTING: KNOWLEDGE GRAPH EMPOWERS THE DIGITAL ECONOMY, CCKS 2022 | 2022 / Vol. 1669
Keywords
Document-level relation extraction; Transformer model; Dependency syntax; Supervised contrastive learning; Gaussian probability;
DOI
10.1007/978-981-19-7596-7_4
CLC Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Document-level relation extraction is more challenging than its sentence-level counterpart, as it extracts unknown relational facts from plain text at the document level. Studies have shown that the Transformer architecture models long-distance dependencies without regard to the syntax-level dependencies between tokens in the sequence, which hinders its ability to model long-range dependencies. Furthermore, both the global information among relational triples and the local information around entities are critical. In this paper, we propose a Dependency Syntax Transformer and Supervised Contrastive Learning model (DSTSC) for document-level relation extraction. Specifically, dependency syntax information guides the Transformer to strengthen attention between tokens connected by a dependency syntax relation, improving its ability to model document-level dependencies. Supervised contrastive learning with fused knowledge captures the global information among relational triples, and Gaussian probability distributions are designed to capture the local information around entities. Our experiments on two document-level relation extraction datasets, CDR and GDA, yield remarkable results.
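The supervised contrastive objective mentioned in the abstract (in the style of Khosla et al., 2020) pulls together relation-pair embeddings that share a label and pushes apart those that do not. A minimal sketch follows; the function name `supcon_loss`, the temperature value, and the batch-of-relation-embeddings setup are illustrative assumptions, not the paper's actual implementation:

```python
import torch
import torch.nn.functional as F

def supcon_loss(embeddings, labels, temperature=0.1):
    """Supervised contrastive loss sketch: anchors are attracted to
    same-label samples and repelled from all others in the batch."""
    z = F.normalize(embeddings, dim=1)        # unit-norm embeddings
    sim = z @ z.T / temperature               # pairwise cosine similarities
    n = z.size(0)
    eye = torch.eye(n, dtype=torch.bool)
    # positives: same relation label, excluding the anchor itself
    pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~eye
    # log-softmax over all other samples (self-similarity excluded)
    sim = sim.masked_fill(eye, float('-inf'))
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    # mean log-probability of positives, for anchors with >= 1 positive
    pos_counts = pos_mask.sum(1)
    valid = pos_counts > 0
    pos_log = log_prob.masked_fill(~pos_mask, 0.0).sum(1)
    return (-pos_log[valid] / pos_counts[valid]).mean()
```

Batching several relation instances per document lets the loss exploit the global structure among relational triples that the abstract highlights.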
Pages: 43-54
Page count: 12