A Combined Extractive With Abstractive Model for Summarization

Cited by: 9
Authors
Liu, Wenfeng [1 ]
Gao, Yaling [1 ]
Li, Jinming [1 ]
Yang, Yuzhen [1 ]
Affiliations
[1] Heze Univ, Sch Comp, Heze 274015, Peoples R China
Keywords
Syntactics; Feature extraction; Semantics; Reinforcement learning; Neural networks; Licenses; Deep learning; Extractive summarization; abstractive summarization; beam search; word embeddings;
DOI
10.1109/ACCESS.2021.3066484
Chinese Library Classification (CLC)
TP [automation technology, computer technology]
Discipline classification code
0812
Abstract
Aiming at the difficulties of document-level summarization, this paper presents a two-stage, extract-then-abstract summarization model. In the first stage, we extract coarse-grained important sentences from the document by scoring them against a sentence-similarity matrix (used only in the first round) or against the current pseudo-title, taking full account of features such as sentence position and paragraph position and considering how strongly the most important sentences are differentiated from the rest. The second stage is abstractive: a beam search algorithm restructures and rewrites the syntactic blocks of the extracted sentences. Each newly generated summary sentence serves as the pseudo-title for the next round, and the globally optimal pseudo-title acts as the final summary. Extensive experiments on the corresponding data set show that our model obtains better results.
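The abstract describes an iterative extract-then-abstract loop: salient sentences are scored and extracted (against a sentence-similarity matrix in the first round, against the current pseudo-title afterwards), rewritten by beam search, and the result is fed back as the next round's pseudo-title. The Python sketch below is only a minimal illustration of that loop under assumed details; the bag-of-words cosine similarity, the position bonus, the length-based beam score, and all function names are placeholders rather than the authors' implementation.

from collections import Counter
import math


def sentence_similarity(a, b):
    # Bag-of-words cosine similarity (an assumed stand-in for the paper's
    # sentence-similarity measure).
    ca, cb = Counter(a.lower().split()), Counter(b.lower().split())
    num = sum(ca[w] * cb[w] for w in set(ca) & set(cb))
    den = math.sqrt(sum(v * v for v in ca.values())) * math.sqrt(sum(v * v for v in cb.values()))
    return num / den if den else 0.0


def extract_sentences(sentences, reference, k=3):
    # Stage 1 (extractive): score each sentence against the reference
    # (the whole document in round one, the pseudo-title afterwards) and
    # add a simple position feature that favours early sentences.
    scored = []
    for i, s in enumerate(sentences):
        score = sentence_similarity(s, reference) + 0.1 / (1 + i)  # assumed weights
        scored.append((score, s))
    return [s for _, s in sorted(scored, reverse=True)[:k]]


def beam_rewrite(blocks, beam_width=3, max_steps=10):
    # Stage 2 (abstractive, greatly simplified): beam search over the
    # extracted blocks to assemble a candidate summary; the length-based
    # score is a placeholder for a learned scoring model.
    beams = [([], 0.0)]
    for _ in range(max_steps):
        candidates = []
        for seq, score in beams:
            for block in blocks:
                if block not in seq:
                    candidates.append((seq + [block], score + len(block.split())))
        if not candidates:
            break
        beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_width]
    return " ".join(beams[0][0])


def summarize(sentences, rounds=3):
    # The first round uses the full document as the reference (a stand-in
    # for the similarity-matrix step); each later round uses the pseudo-title
    # produced by the previous round, mirroring the loop in the abstract.
    pseudo_title = " ".join(sentences)
    for _ in range(rounds):
        extracted = extract_sentences(sentences, pseudo_title)
        pseudo_title = beam_rewrite(extracted)
    return pseudo_title


if __name__ == "__main__":
    doc = [
        "The model first extracts salient sentences using positional features.",
        "A beam search then rewrites the extracted blocks into a fluent summary.",
        "The new summary serves as the pseudo-title for the next round.",
    ]
    print(summarize(doc))

The sketch simply returns the last pseudo-title, whereas the paper keeps the globally optimal pseudo-title across rounds as the final summary.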
Pages: 43970 - 43980
Number of pages: 11