FA-RCNet: A Fused Feature Attention Network for Relationship Classification

Cited by: 1
Authors
Tian, Jiakai [1 ]
Li, Gang [1 ]
Zhou, Mingle [1 ]
Li, Min [1 ]
Han, Delong [1 ]
Affiliations
[1] Qilu Univ Technol, Shandong Acad Sci, Shandong Comp Sci Ctr, Natl Supercomp Ctr Jinan, Jinan 250316, Peoples R China
Source
APPLIED SCIENCES-BASEL | 2022, Vol. 12, Issue 23
Keywords
relationship classification; attentional mechanisms; feature fusion;
DOI
10.3390/app122312460
CLC Number
O6 [Chemistry]
Discipline Code
0703
Abstract
Relation extraction is an important task in natural language processing, playing an integral role in intelligent question-answering systems, semantic search, and knowledge graph construction. Previous studies have demonstrated the effectiveness of convolutional neural networks (CNNs), recurrent neural networks (RNNs), and long short-term memory networks (LSTMs) on relation classification tasks. More recently, owing to its superior performance, the pre-trained model BERT has become the feature extraction module of many relation classification models, and BERT-based approaches have achieved good results. However, most such work uses only the deepest-level features, ignoring the important role that shallow-level information plays in relation classification. To address this problem, this paper proposes FA-RCNet (fusion-attention relationship classification network), a relationship classification network with feature fusion and an attention mechanism. FA-RCNet fuses shallow-level features with deep-level features and augments entity features and global features through an attention module, so that the resulting feature vector is better suited to the relation classification task. In addition, compared with previously published models, the proposed model achieves results on both the SemEval-2010 Task 8 and KBP37 datasets that improve on prior work.
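The fuse-then-attend idea described in the abstract can be sketched in a few lines of NumPy. This is only an illustrative assumption of the general pattern (concatenate token features from a shallow and a deep encoder layer, then attention-pool them into one sentence vector); the function names, the single scoring vector `w`, and the random inputs are hypothetical and do not reproduce the paper's actual FA-RCNet architecture.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax for attention weights.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def fuse_and_attend(shallow, deep, w):
    """Fuse shallow- and deep-layer token features, then attention-pool.

    shallow, deep: (seq_len, hidden) token features from two encoder layers
    w: (2*hidden,) scoring vector (learned in practice; random here)
    Returns a (2*hidden,) sentence-level feature vector.
    """
    fused = np.concatenate([shallow, deep], axis=-1)  # (seq_len, 2*hidden)
    scores = fused @ w                                # one score per token
    alpha = softmax(scores)                           # attention weights, sum to 1
    return alpha @ fused                              # weighted pooling

rng = np.random.default_rng(0)
seq_len, hidden = 6, 4
shallow = rng.normal(size=(seq_len, hidden))  # stand-in for a shallow BERT layer
deep = rng.normal(size=(seq_len, hidden))     # stand-in for the final BERT layer
w = rng.normal(size=2 * hidden)

sent_vec = fuse_and_attend(shallow, deep, w)
print(sent_vec.shape)  # (8,) = 2*hidden
```

In a real model the two inputs would come from different hidden layers of the same BERT encoder, and the pooled vector would feed a classification head over the relation labels.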
Pages: 19