Chinese Triple Extraction Based on BERT Model

Cited by: 2
Authors
Deng, Weidong [1 ,2 ]
Liu, Yun [1 ,2 ]
Affiliations
[1] Beijing Jiao Tong Univ, Sch Elect & Informat Engn, Beijing, Peoples R China
[2] Beijing Municipal Commiss Educ, Key Lab Commun & Informat Syst, Beijing, Peoples R China
Source
PROCEEDINGS OF THE 2021 15TH INTERNATIONAL CONFERENCE ON UBIQUITOUS INFORMATION MANAGEMENT AND COMMUNICATION (IMCOM 2021) | 2021
Funding
National Natural Science Foundation of China;
Keywords
triple; information extraction; relation classification; entity tagging; BERT;
DOI
10.1109/IMCOM51814.2021.9377404
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
Information extraction (IE) plays a crucial role in natural language processing: it extracts structured facts such as entities, attributes, relations and events from unstructured text. Its results can be applied in many fields, including information retrieval and intelligent question answering, to name a few. We define a pair of entities and their relation extracted from a sentence as a triple. Unlike most relation extraction tasks, which only classify the relation between entities that are already known, we extract both the relation and the entities (a triple, as defined above) from a plain sentence. Many methods have been proposed for information extraction, and deep learning has made great progress in recent years; in particular, the pretrained model BERT has achieved highly successful results on many NLP tasks. We therefore divide triple extraction into two sub-tasks, relation classification and entity tagging, and design two BERT-based models for them: a CNN-BERT and a Simple BERT. Experiments on the DuIE Chinese dataset show excellent results.
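The abstract describes a two-stage pipeline, relation classification followed by entity tagging, but the record contains no code. The sketch below only illustrates that decomposition, using the Hugging Face transformers library with the bert-base-chinese checkpoint; the label sets, model heads, and decoding logic are assumptions for illustration and are not the authors' CNN-BERT or Simple BERT implementations.

import torch
from transformers import (
    BertForSequenceClassification,
    BertForTokenClassification,
    BertTokenizerFast,
)

# Hypothetical label sets; the real ones would come from the DuIE schema.
RELATIONS = ["出生地", "作者", "导演"]
TAGS = ["O", "B-SUBJ", "I-SUBJ", "B-OBJ", "I-OBJ"]

tokenizer = BertTokenizerFast.from_pretrained("bert-base-chinese")
# Stage 1: sentence-level relation classification head on top of BERT.
rel_model = BertForSequenceClassification.from_pretrained(
    "bert-base-chinese", num_labels=len(RELATIONS))
# Stage 2: token-level entity tagging head on top of BERT.
tag_model = BertForTokenClassification.from_pretrained(
    "bert-base-chinese", num_labels=len(TAGS))

def extract_triple(sentence: str):
    """Predict (subject, relation, object) for one sentence.
    Both heads are randomly initialized here; in practice each model
    would first be fine-tuned on the DuIE training data."""
    enc = tokenizer(sentence, return_tensors="pt")

    with torch.no_grad():
        relation_id = rel_model(**enc).logits.argmax(dim=-1).item()
        tag_ids = tag_model(**enc).logits.argmax(dim=-1)[0].tolist()

    # Decode BIO tags back into subject and object strings,
    # skipping special tokens such as [CLS] and [SEP].
    tokens = tokenizer.convert_ids_to_tokens(enc["input_ids"][0].tolist())
    subject = "".join(t for t, i in zip(tokens, tag_ids)
                      if t not in tokenizer.all_special_tokens
                      and TAGS[i].endswith("SUBJ"))
    obj = "".join(t for t, i in zip(tokens, tag_ids)
                  if t not in tokenizer.all_special_tokens
                  and TAGS[i].endswith("OBJ"))
    return subject, RELATIONS[relation_id], obj

print(extract_triple("鲁迅出生于浙江绍兴。"))

Note that in this sketch the tagger does not condition on the predicted relation; the paper's two models may couple the stages differently.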
Pages: 5
Related Papers
50 records in total
  • [21] Chinese Evaluation Phrase Extraction Based on Cascaded Model
    Wang, Yashen
    Feng, Chong
    Liu, Quanchao
    Huang, Heyan
    WEB-AGE INFORMATION MANAGEMENT, WAIM 2014, 2014, 8485 : 192 - 203
  • [22] DdERT: Research on Named Entity Recognition for Mine Hoist Using a Chinese BERT Model
    Dang, Xiaochao
    Wang, Li
    Dong, Xiaohui
    Li, Fenfang
    Deng, Han
    ELECTRONICS, 2023, 12 (19)
  • [23] Investigation of BERT Model on Biomedical Relation Extraction Based on Revised Fine-tuning Mechanism
    Su, Peng
    Vijay-Shanker, K.
    2020 IEEE INTERNATIONAL CONFERENCE ON BIOINFORMATICS AND BIOMEDICINE, 2020, : 2522 - 2529
  • [24] CABiLSTM-BERT: Aspect-based sentiment analysis model based on deep implicit feature extraction
    He, Bo
    Zhao, Ruoyu
    Tang, Dali
    KNOWLEDGE-BASED SYSTEMS, 2025, 309
  • [25] The Exploration of the Reasoning Capability of BERT in Relation Extraction
    Li, Lili
    Xin, Xin
    Guo, Ping
    2020 10TH INTERNATIONAL CONFERENCE ON INFORMATION SCIENCE AND TECHNOLOGY (ICIST), 2020, : 219 - 228
  • [26] A tag based joint extraction model for Chinese medical text
    Liu, XingYu
    Liu, Yu
    Wu, HangYu
    Guan, QingQuan
    COMPUTATIONAL BIOLOGY AND CHEMISTRY, 2021, 93
  • [27] Fine-tuning BERT for Joint Entity and Relation Extraction in Chinese Medical Text
    Xue, Kui
    Zhou, Yangming
    Ma, Zhiyuan
    Ruan, Tong
    Zhang, Huanhuan
    He, Ping
    2019 IEEE INTERNATIONAL CONFERENCE ON BIOINFORMATICS AND BIOMEDICINE (BIBM), 2019, : 892 - 897
  • [28] A Deep Learning Model Based on BERT and Sentence Transformer for Semantic Keyphrase Extraction on Big Social Data
    Devika, R.
    Vairavasundaram, Subramaniyaswamy
    Mahenthar, C. Sakthi Jay
    Varadarajan, Vijayakumar
    Kotecha, Ketan
    IEEE ACCESS, 2021, 9 : 165252 - 165261
  • [29] Research on BERT-Based Audit Entity Extraction Method
    Xiang, Rui
    Li, Weibo
    Yan, Hua
    2021 4TH INTERNATIONAL CONFERENCE ON ROBOTICS, CONTROL AND AUTOMATION ENGINEERING (RCAE 2021), 2021, : 176 - 180
  • [30] RE-BERT: Automatic Extraction of Software Requirements from App Reviews using BERT Language Model
    de Araujo, Adailton Ferreira
    Marcacini, Ricardo Marcondes
    36TH ANNUAL ACM SYMPOSIUM ON APPLIED COMPUTING, SAC 2021, 2021, : 1321 - 1327