KI-HABS: Key Information Guided Hierarchical Abstractive Summarization

Cited by: 2
Authors
Zhang, Mengli [1 ]
Zhou, Gang [1 ]
Yu, Wanting [1 ]
Liu, Wenfen [2 ]
Affiliations
[1] State Key Lab Math Engn & Adv Comp, Zhengzhou 450001, Henan, Peoples R China
[2] Guilin Univ Elect Technol, Guangxi Key Lab Cryptogp & Informat Secur, Sch Comp Sci & Informat Secur, Guilin 541004, Guangxi, Peoples R China
Source
KSII TRANSACTIONS ON INTERNET AND INFORMATION SYSTEMS | 2021, Vol. 15, No. 12
Funding
National Natural Science Foundation of China;
Keywords
neural network; deep learning; NLP; abstractive summarization; selective encoding;
DOI
10.3837/tiis.2021.12.001
CLC Number
TP [Automation Technology, Computer Technology];
Discipline Code
0812 ;
Abstract
With the unprecedented growth of textual information on the Internet, efficient automatic summarization systems have become an urgent need. Recently, neural models based on the encoder-decoder architecture with an attention mechanism have demonstrated strong performance on the sentence summarization task. However, for paragraph- or document-level summarization, these models fail to mine the core information in the input text, which leads to information loss and repetition. In this paper, we propose an abstractive document summarization method, denoted KI-HABS, that applies guidance signals from key sentences to the encoder of a hierarchical encoder-decoder architecture. Specifically, we first train an extractor based on a hierarchical bidirectional GRU to select key sentences from the input document. Then, we encode the key sentences into a sentence-level key information representation. Finally, we adopt a selective encoding strategy guided by this key information representation to filter the source information, which establishes a connection between the key sentences and the document. We evaluate our model on the CNN/Daily Mail and Gigaword datasets. The experimental results demonstrate that our method generates more informative and concise summaries, outperforming competitive baseline models.
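The selective encoding step described in the abstract can be viewed as a gating operation: each sentence encoding is scaled element-wise by a sigmoid gate computed from that encoding and the key information vector, so content unrelated to the key sentences is attenuated. Below is a minimal numpy sketch of such a gate; all names, shapes, and the random weight initialization are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def selective_encode(h, k, W_h, W_k, b):
    """Filter sentence encodings h (n x d) with a gate conditioned on the
    key information vector k (d,):
        g_i = sigmoid(W_h h_i + W_k k + b);   h_i' = g_i * h_i
    Gate values lie in (0, 1), so unimportant dimensions are damped.
    """
    gate = sigmoid(h @ W_h.T + k @ W_k.T + b)  # shape (n, d)
    return gate * h

# Hypothetical dimensions: 5 sentence encodings of size 8.
n, d = 5, 8
h = rng.standard_normal((n, d))        # sentence-level encodings (assumed)
k = rng.standard_normal(d)             # key information representation (assumed)
W_h = 0.1 * rng.standard_normal((d, d))
W_k = 0.1 * rng.standard_normal((d, d))
b = np.zeros(d)

h_filtered = selective_encode(h, k, W_h, W_k, b)
print(h_filtered.shape)  # prints (5, 8)
```

Because the gate is strictly between 0 and 1, every filtered component has magnitude no larger than the original encoding, which is the filtering effect the selective encoding strategy relies on.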
Pages: 4275-4291
Number of pages: 17
Related Papers
27 records
  • [1] Key phrase aware transformer for abstractive summarization
    Liu, Shuaiqi
    Cao, Jiannong
    Yang, Ruosong
    Wen, Zhiyuan
    INFORMATION PROCESSING & MANAGEMENT, 2022, 59 (03)
  • [2] Keyword-guided abstractive code summarization via incorporating structural and contextual information
    Cheng, Wuyan
    Hu, Po
    Wei, Shaozhi
    Mo, Ran
    INFORMATION AND SOFTWARE TECHNOLOGY, 2022, 150
  • [3] Multi-task learning for abstractive text summarization with key information guide network
    Xu, Weiran
    Li, Chenliang
    Lee, Minghao
    Zhang, Chi
    EURASIP JOURNAL ON ADVANCES IN SIGNAL PROCESSING, 2020, 2020 (01)
  • [4] Frame Semantics guided network for Abstractive Sentence Summarization
    Guan, Yong
    Guo, Shaoru
    Li, Ru
    Li, Xiaoli
    Zhang, Hu
    KNOWLEDGE-BASED SYSTEMS, 2021, 221
  • [5] Hie-Transformer: A Hierarchical Hybrid Transformer for Abstractive Article Summarization
    Zhang, Xuewen
    Meng, Kui
    Liu, Gongshen
    NEURAL INFORMATION PROCESSING (ICONIP 2019), PT III, 2019, 11955 : 248 - 258
  • [6] Enhancing abstractive summarization of scientific papers using structure information
    Bao, Tong
    Zhang, Heng
    Zhang, Chengzhi
    EXPERT SYSTEMS WITH APPLICATIONS, 2025, 261
  • [7] A two-step abstractive summarization model with asynchronous and enriched-information decoding
    Li, Shuaimin
    Xu, Jungang
    NEURAL COMPUTING & APPLICATIONS, 2021, 33 (04) : 1159 - 1170