A Combined Extractive With Abstractive Model for Summarization

Cited by: 9
Authors
Liu, Wenfeng [1 ]
Gao, Yaling [1 ]
Li, Jinming [1 ]
Yang, Yuzhen [1 ]
Affiliations
[1] Heze Univ, Sch Comp, Heze 274015, Peoples R China
Keywords
Syntactics; Feature extraction; Semantics; Reinforcement learning; Neural networks; Licenses; Deep learning; Extractive summarization; abstractive summarization; beam search; word embeddings
DOI
10.1109/ACCESS.2021.3066484
Chinese Library Classification
TP [automation technology, computer technology];
Discipline code
0812;
Abstract
Aiming at the difficulties of document-level summarization, this paper presents a two-stage, extract-then-abstract summarization model. In the first stage, important sentences are extracted by combining a sentence-similarity matrix (used only in the first round) or the pseudo-title with features such as sentence position and paragraph position; this extracts coarse-grained sentences from the document while accounting for how the most important sentences differ from one another. The second stage is abstractive: a beam search algorithm restructures and rewrites the syntactic blocks of the extracted sentences. Each newly generated summary sentence serves as the pseudo-summary for the next round, and the globally optimal pseudo-title acts as the final summary. Extensive experiments on the corresponding data set show that our model obtains better results.
Pages: 43970-43980
Page count: 11
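
The abstract above describes a two-stage pipeline: an extractive stage that scores sentences using a similarity matrix plus positional features, followed by an abstractive stage driven by beam search. Below is a minimal Python sketch of that idea under stated assumptions: bag-of-words cosine similarity, a simple position feature with an illustrative weight, and a generic beam-search skeleton with placeholder `expand`/`score` hooks. None of this reproduces the authors' actual features, scoring, or rewriting model; it only illustrates the shape of the two stages.

```python
import math
import re
from collections import Counter

def bow(sentence):
    """Bag-of-words vector (lower-cased word counts) for one sentence."""
    return Counter(re.findall(r"[a-z']+", sentence.lower()))

def cosine(a, b):
    """Cosine similarity between two Counter vectors."""
    num = sum(a[w] * b[w] for w in set(a) & set(b))
    den = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return num / den if den else 0.0

def extract_sentences(sentences, k=3, pos_weight=0.3):
    """Stage 1 (illustrative): score each sentence by its centrality in a
    sentence-similarity matrix plus a simple position feature, then keep the
    top-k sentences in document order as the coarse-grained extract."""
    vecs = [bow(s) for s in sentences]
    n = len(sentences)
    sim = [[cosine(vecs[i], vecs[j]) for j in range(n)] for i in range(n)]
    scores = []
    for i in range(n):
        centrality = (sum(sim[i]) - sim[i][i]) / max(n - 1, 1)  # similarity to the rest
        position = 1.0 - i / max(n - 1, 1)                      # earlier sentences score higher
        scores.append((1 - pos_weight) * centrality + pos_weight * position)
    top = sorted(range(n), key=lambda i: scores[i], reverse=True)[:k]
    return [sentences[i] for i in sorted(top)]

def beam_search(start, expand, score, beam_width=3, steps=5):
    """Stage 2 skeleton (illustrative): generic beam search that keeps the
    best `beam_width` partial rewrites at each step. `expand` proposes
    continuations of a candidate and `score` ranks candidates; both are
    placeholders for the paper's syntactic-block rewriter."""
    beam = [start]
    for _ in range(steps):
        candidates = [c for state in beam for c in expand(state)]
        if not candidates:
            break
        beam = sorted(candidates, key=score, reverse=True)[:beam_width]
    return max(beam, key=score)

if __name__ == "__main__":
    doc = [
        "Document summarization condenses a long text into a short summary.",
        "Extractive methods select important sentences from the source document.",
        "Abstractive methods rewrite the selected content into new sentences.",
        "The weather today is sunny.",
    ]
    print(extract_sentences(doc, k=2))
```

Running the demo prints the two sentences judged most central and earliest in the toy document; in the paper's setting, such an extract would then be rewritten by the beam-search stage and fed back as the pseudo-summary for the next round.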