An Abstractive Summarization Model Based on Joint-Attention Mechanism and a Priori Knowledge

Times Cited: 0
|
Authors
Li, Yuanyuan [1 ]
Huang, Yuan [1 ]
Huang, Weijian [1 ]
Yu, Junhao [1 ]
Huang, Zheng [1 ]
Affiliations
[1] Hebei Univ Engn, Sch Informat & Elect Engn, Handan 056038, Peoples R China
Source
APPLIED SCIENCES-BASEL | 2023, Vol. 13, Issue 7
Keywords
abstractive summarization; joint-attention mechanism; prior knowledge; reinforcement learning;
DOI
10.3390/app13074610
CLC Number
O6 [Chemistry];
Discipline Code
0703;
Abstract
An abstractive summarization model based on a joint-attention mechanism and a priori knowledge is proposed to address two weaknesses of existing abstractive summarization models: inadequate semantic understanding of the text, and generated summaries that do not conform to human language habits. First, the word vectors most relevant to the original text are selected. Second, the original text is represented at two levels of granularity, as word-level vectors and sentence-level vectors; relationships then exist both among the word-level vectors and among the sentence-level vectors, and the decoder weighs the two levels according to their relationship with its hidden state. Then, the pointer-generator network is improved with a priori knowledge. Finally, reinforcement learning is applied to improve the quality of the generated summaries. Experiments on two classical datasets, CNN/DailyMail and DUC 2004, show that the model performs well and effectively improves the quality of the generated summaries.
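The joint-attention step described in the abstract can be sketched as follows. This is a minimal NumPy illustration under stated assumptions, not the authors' implementation: the function names are hypothetical, and gating the word-level and sentence-level context vectors by their best alignment score is one plausible way for the decoder to "discriminate" between the two granularities.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over the last axis.
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def joint_attention(word_vecs, sent_vecs, hidden):
    """Hypothetical joint-attention step: score word-level and
    sentence-level vectors against the decoder hidden state,
    attend within each level, then blend the two context vectors."""
    word_scores = word_vecs @ hidden            # (n_words,)
    sent_scores = sent_vecs @ hidden            # (n_sents,)
    word_ctx = softmax(word_scores) @ word_vecs  # word-level context
    sent_ctx = softmax(sent_scores) @ sent_vecs  # sentence-level context
    # Assumption: gate between granularities by their best alignment score.
    g = softmax(np.array([word_scores.max(), sent_scores.max()]))
    return g[0] * word_ctx + g[1] * sent_ctx

# Toy usage: 6 word vectors and 3 sentence vectors of dimension 4.
rng = np.random.default_rng(0)
ctx = joint_attention(rng.normal(size=(6, 4)),
                      rng.normal(size=(3, 4)),
                      rng.normal(size=4))
```

In the paper's full model this context vector would additionally feed a pointer-generator network augmented with prior knowledge, and the whole pipeline would be fine-tuned with reinforcement learning; those stages are omitted here.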
Pages: 19