A Combined Extractive With Abstractive Model for Summarization

Cited by: 9
Authors
Liu, Wenfeng [1 ]
Gao, Yaling [1 ]
Li, Jinming [1 ]
Yang, Yuzhen [1 ]
Affiliations
[1] Heze Univ, Sch Comp, Heze 274015, Peoples R China
Keywords
Syntactics; Feature extraction; Semantics; Reinforcement learning; Neural networks; Licenses; Deep learning; Extractive summarization; abstractive summarization; beam search; word embeddings
DOI
10.1109/ACCESS.2021.3066484
CLC Number
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
To address the difficulties of document-level summarization, this paper presents a two-stage model that is first extractive and then abstractive. In the first stage, we extract the important sentences using a sentence similarity matrix (used only in the first round) or the pseudo-title, together with features such as sentence position and paragraph position; this yields a coarse-grained selection of sentences from the document while accounting for how strongly the most important sentences are differentiated. The second stage is abstractive: a beam search algorithm restructures and rewrites the syntactic blocks of the extracted sentences. The newly generated summary sentence serves as the pseudo-summary for the next round, and the globally optimal pseudo-title acts as the final summary. Extensive experiments on the corresponding data set show that our model achieves better results.
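The abstract describes the extract-then-rewrite pipeline only at a high level. As a rough illustration (not the authors' implementation), the Python sketch below pairs a similarity-matrix-plus-position extractor with a beam search over orderings of the extracted sentences; the bag-of-words similarity, the 0.7/0.3 feature weights, the adjacency-based coherence score, and the example document are all placeholder assumptions, and the paper's model rewrites syntactic blocks with learned components rather than simply reordering whole sentences.

    # Minimal sketch of a two-stage extract-then-rewrite pipeline.
    # NOT the authors' model: similarity measure, feature weights, and the
    # toy beam-search objective are placeholder assumptions.
    import math
    from collections import Counter

    def bow_cosine(a, b):
        # Cosine similarity between bag-of-words vectors of two sentences.
        va, vb = Counter(a.lower().split()), Counter(b.lower().split())
        dot = sum(va[w] * vb[w] for w in set(va) & set(vb))
        na = math.sqrt(sum(c * c for c in va.values()))
        nb = math.sqrt(sum(c * c for c in vb.values()))
        return dot / (na * nb) if na and nb else 0.0

    def extract_sentences(sentences, k=3):
        # Stage 1: score each sentence by its centrality in the pairwise
        # similarity matrix plus a simple position feature, then keep the top-k.
        n = len(sentences)
        centrality = [sum(bow_cosine(sentences[i], sentences[j])
                          for j in range(n) if j != i) for i in range(n)]
        position = [1.0 - i / n for i in range(n)]  # earlier sentences score higher
        scores = [0.7 * c + 0.3 * p for c, p in zip(centrality, position)]
        top = sorted(range(n), key=lambda i: scores[i], reverse=True)[:k]
        return [sentences[i] for i in sorted(top)]  # keep document order

    def beam_rewrite(extracted, beam_width=2):
        # Stage 2 (toy stand-in): beam search over orderings of the extracted
        # sentences, scoring each candidate by adjacent-sentence coherence.
        beams = [((), 0.0)]
        for _ in range(len(extracted)):
            candidates = []
            for seq, score in beams:
                for i in range(len(extracted)):
                    if i in seq:
                        continue
                    gain = bow_cosine(extracted[seq[-1]], extracted[i]) if seq else 0.0
                    candidates.append((seq + (i,), score + gain))
            beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_width]
        best, _ = beams[0]
        return " ".join(extracted[i] for i in best)

    if __name__ == "__main__":
        document = [
            "Document-level summarization remains difficult for purely abstractive models.",
            "The first stage extracts salient sentences using similarity and position features.",
            "The second stage rewrites the extracted sentences with beam search.",
            "Each new summary serves as the pseudo-title for the next round.",
        ]
        print(beam_rewrite(extract_sentences(document, k=2)))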
Pages: 43970-43980 (11 pages)
Related Papers (50 records)
  • [1] Abstractive vs. Extractive Summarization: An Experimental Review
    Giarelis, Nikolaos
    Mastrokostas, Charalampos
    Karacapilidis, Nikos
    APPLIED SCIENCES-BASEL, 2023, 13 (13):
  • [2] Assessing Abstractive and Extractive Methods for Automatic News Summarization
    Oliveira, Hilario
    Lins, Rafael Dueire
    PROCEEDINGS OF THE 2024 ACM SYMPOSIUM ON DOCUMENT ENGINEERING, DOCENG 2024, 2024,
  • [3] Abstractive Summarization Improved by WordNet-Based Extractive Sentences
    Xie, Niantao
    Li, Sujian
    Ren, Huiling
    Zhai, Qibin
    NATURAL LANGUAGE PROCESSING AND CHINESE COMPUTING, PT I, 2018, 11108 : 404 - 415
  • [4] Extractive Elementary Discourse Units for Improving Abstractive Summarization
    Xiong, Ye
    Racharak, Teeradaj
    Minh Le Nguyen
    PROCEEDINGS OF THE 45TH INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL (SIGIR '22), 2022, : 2675 - 2679
  • [5] Sentence Pair Embeddings Based Evaluation Metric for Abstractive and Extractive Summarization
    Akula, Ramya
    Garibay, Ivan
    LREC 2022: THIRTEENTH INTERNATIONAL CONFERENCE ON LANGUAGE RESOURCES AND EVALUATION, 2022, : 6009 - 6017
  • [6] Combined Objective Function in Deep Learning Model for Abstractive Summarization
    Le, Tung
    Nguyen Le Minh
    PROCEEDINGS OF THE NINTH INTERNATIONAL SYMPOSIUM ON INFORMATION AND COMMUNICATION TECHNOLOGY (SOICT 2018), 2018, : 84 - 91
  • [7] Reinforced Abstractive Text Summarization With Semantic Added Reward
    Jang, Heewon
    Kim, Wooju
    IEEE ACCESS, 2021, 9 : 103804 - 103810
  • [8] A Faster Method For Generating Chinese Text Summaries-Combining Extractive Summarization And Abstractive Summarization
    Yang, Wenchuan
    Gu, Tianyu
    Sui, Runqi
    2022 5TH INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND NATURAL LANGUAGE PROCESSING, MLNLP 2022, 2022, : 54 - 58
  • [9] StarSum: A Star Architecture Based Model for Extractive Summarization
    Shi, Kaile
    Cai, Xiaoyan
    Yang, Libin
    Zhao, Jintao
    Pan, Shirui
    IEEE-ACM TRANSACTIONS ON AUDIO SPEECH AND LANGUAGE PROCESSING, 2022, 30 : 3020 - 3031
  • [10] Abstractive Summarization Model with Adaptive Sparsemax
    Guo, Shiqi
    Si, Yumeng
    Zhao, Jing
    NATURAL LANGUAGE PROCESSING AND CHINESE COMPUTING, NLPCC 2022, PT I, 2022, 13551 : 810 - 821