A Combined Extractive With Abstractive Model for Summarization

Cited by: 9
Authors
Liu, Wenfeng [1 ]
Gao, Yaling [1 ]
Li, Jinming [1 ]
Yang, Yuzhen [1 ]
Affiliations
[1] Heze Univ, Sch Comp, Heze 274015, Peoples R China
Keywords
Syntactics; Feature extraction; Semantics; Reinforcement learning; Neural networks; Licenses; Deep learning; Extractive summarization; abstractive summarization; beam search; word embeddings
DOI
10.1109/ACCESS.2021.3066484
CLC Classification
TP [Automation technology; computer technology]
Subject Classification Code
0812
Abstract
Aiming at the difficulties of document-level summarization, this paper presents a two-stage, extractive-then-abstractive summarization model. In the first stage, important sentences are extracted by combining a sentence similarity matrix (used only in the first round) or the current pseudo-title with sentence-level features (such as sentence position and paragraph position); this extracts coarse-grained sentences from the document while taking into account how well the most important sentences are differentiated. The second stage is abstractive: a beam search algorithm restructures and rewrites the syntactic blocks of the extracted sentences. The newly generated summary sentence serves as the pseudo-title for the next round, and the globally optimal pseudo-title is taken as the final summary. Extensive experiments on the corresponding data set show that our model obtains better results.
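To make the two-stage loop described in the abstract concrete, the following is a minimal, illustrative Python sketch of the extract-then-rewrite cycle. The bag-of-words cosine similarity, the position feature, the 0.3 feature weight, and all function names are assumptions for illustration only, not the authors' implementation; the abstractive beam-search rewrite is left as a stub, since it requires a trained generation model.

```python
# Illustrative sketch of the extract-then-abstract loop (not the authors' code).
import math
from collections import Counter

def cosine_sim(a, b):
    """Cosine similarity between two bag-of-words sentence vectors."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    num = sum(va[w] * vb[w] for w in set(va) & set(vb))
    den = math.sqrt(sum(c * c for c in va.values())) * \
          math.sqrt(sum(c * c for c in vb.values()))
    return num / den if den else 0.0

def extract_stage(sentences, pseudo_title=None, position_weight=0.3):
    """Stage 1: score sentences by centrality over the similarity matrix
    (first round) or by similarity to the current pseudo-title, combined
    with a simple position feature, and return the top-scoring sentence."""
    scores = []
    for i, s in enumerate(sentences):
        if pseudo_title is None:
            # First round: average similarity to all other sentences.
            relevance = sum(cosine_sim(s, t)
                            for j, t in enumerate(sentences) if j != i)
            relevance /= max(len(sentences) - 1, 1)
        else:
            # Later rounds: similarity to the current pseudo-title.
            relevance = cosine_sim(s, pseudo_title)
        position = 1.0 - i / max(len(sentences), 1)  # earlier sentences score higher
        scores.append((1 - position_weight) * relevance + position_weight * position)
    best = max(range(len(sentences)), key=lambda i: scores[i])
    return sentences[best]

def abstractive_stage(extracted):
    """Stage 2 stub: a real system would beam-search over rewrites of the
    extracted sentence's syntactic blocks; here it is returned unchanged."""
    return extracted

def summarize(sentences, rounds=3):
    pseudo_title = None
    for _ in range(rounds):
        extracted = extract_stage(sentences, pseudo_title)
        pseudo_title = abstractive_stage(extracted)  # next round's pseudo-title
    return pseudo_title
```

The point of the sketch is the control flow: the first round relies on the similarity matrix, every later round scores sentences against the pseudo-title produced by the previous rewrite, and the final pseudo-title is returned as the summary.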
Pages: 43970-43980
Number of pages: 11
Related Papers
50 records in total
  • [31] Summarizing judicial documents: a hybrid extractive-abstractive model with legal domain knowledge
    Gao, Yan
    Wu, Jie
    Liu, Zhengtao
    Li, Juan
    ARTIFICIAL INTELLIGENCE AND LAW, 2025,
  • [32] Improving Abstractive Summarization with Unsupervised Dynamic LoRA Mixtures
    Chernyshev, D. I.
    LOBACHEVSKII JOURNAL OF MATHEMATICS, 2024, 45 (07) : 2995 - 3006
  • [33] Warm-Starting for Improving the Novelty of Abstractive Summarization
    Alomari, Ayham
    Al-Shamayleh, Ahmad Sami
    Idris, Norisma
    Qalid Md Sabri, Aznul
    Alsmadi, Izzat
    Omary, Danah
    IEEE ACCESS, 2023, 11 : 112483 - 112501
  • [34] Introducing bidirectional attention for autoregressive models in abstractive summarization
    Zhao, Jianfei
    Sun, Xin
    Feng, Chong
    INFORMATION SCIENCES, 2025, 689
  • [35] Abstractive text summarization: State of the art, challenges, and improvements
    Shakil, Hassan
    Farooq, Ahmad
    Kalita, Jugal
    NEUROCOMPUTING, 2024, 603
  • [36] Neural abstractive summarization fusing by global generative topics
    Yang Gao
    Yang Wang
    Luyang Liu
    Yidi Guo
    Heyan Huang
    Neural Computing and Applications, 2020, 32 : 5049 - 5058
  • [37] Model of abstractive text summarization for topic-aware communicating agents
    Zhang Z.
    Ren S.
    Guo K.
    Xi'an Dianzi Keji Daxue Xuebao/Journal of Xidian University, 2020, 47 (03): : 97 - 104
  • [38] An Optimized Abstractive Text Summarization Model Using Peephole Convolutional LSTM
    Rahman, Md Motiur
    Siddiqui, Fazlul Hasan
    SYMMETRY-BASEL, 2019, 11 (10):
  • [39] Learning Cluster Patterns for Abstractive Summarization
    Jo, Sung-Guk
    Park, Seung-Hyeok
    Kim, Jeong-Jae
    On, Byung-Won
    IEEE ACCESS, 2023, 11 : 146065 - 146075
  • [40] ABSUM: ABstractive SUMmarization of Lecture Videos
    Devi, M. S. Karthika
    Bhuvaneshwari, R.
    Baskaran, R.
    SMART TRENDS IN COMPUTING AND COMMUNICATIONS, VOL 3, SMARTCOM 2024, 2024, 947 : 237 - 248