Chinese Triple Extraction Based on BERT Model

Cited by: 2
Authors
Deng, Weidong [1 ,2 ]
Liu, Yun [1 ,2 ]
Affiliations
[1] Beijing Jiao Tong Univ, Sch Elect & Informat Engn, Beijing, Peoples R China
[2] Beijing Municipal Commiss Educ, Key Lab Commun & Informat Syst, Beijing, Peoples R China
Source
PROCEEDINGS OF THE 2021 15TH INTERNATIONAL CONFERENCE ON UBIQUITOUS INFORMATION MANAGEMENT AND COMMUNICATION (IMCOM 2021) | 2021
Funding
National Natural Science Foundation of China;
Keywords
triple; information extraction; relation classification; entity tagging; BERT;
DOI
10.1109/IMCOM51814.2021.9377404
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
Information extraction (IE) plays a crucial role in natural language processing: it extracts structured facts such as entities, attributes, relations, and events from unstructured text. The results of information extraction can be applied in many fields, including information retrieval and intelligent QA systems, to name a few. We define a pair of entities together with their relation, extracted from a sentence, as a triple. Unlike most relation extraction tasks, which extract only the relation from a sentence whose entities are already known, we extract both the relation and the entities (a triple, as defined above) from a plain sentence. Many methods have been proposed to solve the information extraction problem, and deep learning has made great progress over the last several years. Within deep learning, the pretrained model BERT has achieved highly successful results on many NLP tasks. We therefore divide our triple extraction task into two sub-tasks, relation classification and entity tagging, and design two BERT-based models for them: a CNN-BERT and a Simple BERT. We evaluated our models on the DuIE Chinese dataset and achieved excellent results.
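The two-stage decomposition described in the abstract (relation classification followed by entity tagging, with the outputs combined into a triple) can be sketched as follows. This is a minimal illustrative pipeline, not the authors' implementation: the relation inventory is a stand-in (DuIE defines its own schema), and both stages are stubbed with toy rules where the paper would use fine-tuned BERT models (CNN-BERT and Simple BERT).

```python
from typing import List, Tuple

# Hypothetical relation inventory; DuIE's real schema differs.
RELATIONS = ["founder_of", "capital_of", "author_of"]

def classify_relation(sentence: str) -> str:
    """Stage 1: relation classification.
    Stub for a BERT-based sentence classifier (e.g. the paper's CNN-BERT),
    where the [CLS] representation would feed a softmax over RELATIONS.
    A toy keyword rule stands in here, purely for illustration."""
    if "founded" in sentence:
        return "founder_of"
    if "capital" in sentence:
        return "capital_of"
    return "author_of"

def tag_entities(sentence: str, relation: str) -> List[Tuple[str, str]]:
    """Stage 2: entity tagging, conditioned on the predicted relation.
    Stub for a BERT token tagger emitting BIO-style entity spans.
    Capitalized tokens stand in for predicted entities, for illustration."""
    tokens = sentence.replace(".", "").split()
    return [(tok, "ENT") for tok in tokens if tok[:1].isupper()]

def extract_triple(sentence: str) -> Tuple[str, str, str]:
    """Combine the two stages into one (subject, relation, object) triple."""
    relation = classify_relation(sentence)
    entities = tag_entities(sentence, relation)
    if len(entities) < 2:
        raise ValueError("need at least two entity spans to form a triple")
    subject, obj = entities[0][0], entities[1][0]
    return (subject, relation, obj)

if __name__ == "__main__":
    print(extract_triple("Larry founded Google."))
```

The design point the sketch preserves is that stage 2 receives stage 1's predicted relation, so entity tagging can be conditioned on which relation the sentence expresses.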
Pages: 5
Related Papers (50 total)
  • [1] Chinese relation extraction based on lattice network improved with BERT model
    Zhang, Zhengchun
    Yu, Qingsong
    2020 5TH INTERNATIONAL CONFERENCE ON MATHEMATICS AND ARTIFICIAL INTELLIGENCE (ICMAI 2020), 2020, : 98 - 102
  • [2] Chinese Named Entity Recognition Based on BERT and Lightweight Feature Extraction Model
    Yang, Ruisen
    Gan, Yong
    Zhang, Chenfang
    INFORMATION, 2022, 13 (11)
  • [3] Dependency-based BERT for Chinese Event Argument Extraction
    Li, Daiyi
    Yan, Li
    Ma, Zongmin
    ACM TRANSACTIONS ON ASIAN AND LOW-RESOURCE LANGUAGE INFORMATION PROCESSING, 2023, 22 (12)
  • [4] BERT-Based Chinese Relation Extraction for Public Security
    Hou, Jiaqi
    Li, Xin
    Yao, Haipeng
    Sun, Haichun
    Mai, Tianle
    Zhu, Rongchen
    IEEE ACCESS, 2020, 8 : 132367 - 132375
  • [5] A joint model for entity and relation extraction based on BERT
    Qiao, Bo
    Zou, Zhuoyang
    Huang, Yu
    Fang, Kui
    Zhu, Xinghui
    Chen, Yiming
    NEURAL COMPUTING & APPLICATIONS, 2022, 34 (05) : 3471 - 3481
  • [6] Extraction Automatic Abstract Method Based on Bert and Graph Model
    Zhao, Weidong
    Chen, Xiaolu
    Wang, Ming
    PROCEEDINGS OF 2021 2ND INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND INFORMATION SYSTEMS (ICAIIS '21), 2021
  • [7] Sentiment analysis of Chinese stock reviews based on BERT model
    Li, Mingzheng
    Chen, Lei
    Zhao, Jing
    Li, Qiang
    APPLIED INTELLIGENCE, 2021, 51 (07) : 5016 - 5024
  • [8] Chinese mineral named entity recognition based on BERT model
    Yu, Yuqing
    Wang, Yuzhu
    Mua, Jingqin
    Li, Wei
    Jiao, Shoutao
    Wang, Zhenhua
    Lv, Pengfei
    Zhu, Yueqin
    EXPERT SYSTEMS WITH APPLICATIONS, 2022, 206