Improving Transformer with Sequential Context Representations for Abstractive Text Summarization

Cited by: 21
Authors:
Cai, Tian [1 ,2 ]
Shen, Mengjun [1 ,2 ]
Peng, Huailiang [1 ,2 ]
Jiang, Lei [1 ]
Dai, Qiong [1 ]
Affiliations:
[1] Chinese Acad Sci, Inst Informat Engn, Beijing, Peoples R China
[2] Univ Chinese Acad Sci, Sch Cyber Secur, Beijing, Peoples R China
Source:
NATURAL LANGUAGE PROCESSING AND CHINESE COMPUTING (NLPCC 2019), PT I | 2019 / Vol. 11838
Funding:
US National Science Foundation
Keywords:
Transformer; Abstractive summarization;
DOI:
10.1007/978-3-030-32233-5_40
Chinese Library Classification (CLC):
TP18 [Artificial Intelligence Theory]
Discipline classification codes:
081104; 0812; 0835; 1405
Abstract:
Recent dominant approaches to abstractive text summarization are mainly RNN-based encoder-decoder frameworks; these methods usually suffer from poor semantic representations of long sequences. In this paper, we propose a new abstractive summarization model, called RC-Transformer (RCT). The model is not only capable of learning long-term dependencies, but also addresses the Transformer's inherent insensitivity to word-order information. We extend the Transformer with an additional RNN-based encoder to capture sequential context representations. To extract salient information effectively, we further construct a convolution module that filters the sequential context by local importance. Experimental results on the Gigaword and DUC-2004 datasets show that our proposed model achieves state-of-the-art performance, even without introducing external information. In addition, our model also holds a speed advantage over RNN-based models.
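The abstract describes two auxiliary components bolted onto a Transformer: an RNN-based encoder that supplies sequential (order-aware) context, and a convolution module that filters that context by local importance. The following is a minimal NumPy sketch of just those two pieces, assuming a simple tanh bidirectional RNN and a depthwise convolution over the time axis; all function and parameter names are hypothetical, and this illustrates the idea only, not the authors' actual RCT implementation.

```python
import numpy as np

def rnn_pass(X, Wx, Wh, reverse=False):
    """One directional pass of a simple tanh RNN over the time axis.

    X: (T, d) input embeddings; Wx: (d, h); Wh: (h, h).
    Returns (T, h) hidden states, in original time order.
    """
    T, _ = X.shape
    h = np.zeros(Wh.shape[0])
    out = []
    steps = range(T - 1, -1, -1) if reverse else range(T)
    for t in steps:
        h = np.tanh(X[t] @ Wx + h @ Wh)
        out.append(h)
    if reverse:
        out.reverse()  # restore original time order
    return np.stack(out)

def sequential_context(X, Wx_f, Wh_f, Wx_b, Wh_b, conv_kernel):
    """Bidirectional RNN context, then a local convolutional filter.

    conv_kernel: (k, 2h) depthwise kernel applied over a window of k
    time steps, producing a locally filtered context of shape (T, 2h).
    """
    fwd = rnn_pass(X, Wx_f, Wh_f)
    bwd = rnn_pass(X, Wx_b, Wh_b, reverse=True)
    H = np.concatenate([fwd, bwd], axis=-1)          # (T, 2h)
    k = conv_kernel.shape[0]
    pad = k // 2
    Hp = np.pad(H, ((pad, pad), (0, 0)))             # same-length output
    C = np.stack([(Hp[t:t + k] * conv_kernel).sum(axis=0)
                  for t in range(H.shape[0])])       # (T, 2h)
    return C
```

In the paper's setting, such filtered sequential context would be combined with the (position-insensitive) self-attention representations inside the Transformer encoder; the sketch stops at producing the context itself.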
Pages: 512 - 524 (13 pages)
Related papers (50 records)
  • [41] Improving named entity correctness of abstractive summarization by generative negative sampling
    Chen, Zheng
    Lin, Hongyu
    COMPUTER SPEECH AND LANGUAGE, 2023, 81
  • [42] Improving abstractive summarization based on dynamic residual network with reinforce dependency
    Liao, Weizhi
    Ma, Yaheng
    Yin, Yanchao
    Ye, Guanglei
    Zuo, Dongzhou
    NEUROCOMPUTING, 2021, 448 : 228 - 237
  • [43] IMPROVING ABSTRACTIVE SUMMARIZATION WITH SEGMENT-AUGMENTED AND POSITION-AWARENESS
    Minh-Phuc Nguyen
    Nhi-Thao Tran
    AI IN COMPUTATIONAL LINGUISTICS, 2021, 189 : 167 - 174
  • [44] Abstractive Text Summarization Using Recurrent Neural Networks: Systematic Literature Review
    Ngoko, Israel Christian Tchouyaa
    Mukherjee, Amlan
    Kabaso, Boniface
    PROCEEDINGS OF THE 15TH INTERNATIONAL CONFERENCE ON INTELLECTUAL CAPITAL, KNOWLEDGE MANAGEMENT & ORGANISATIONAL LEARNING (ICICKM 2018), 2018, : 435 - 439
  • [45] Leveraging ParsBERT and Pretrained mT5 for Persian Abstractive Text Summarization
    Farahani, Mehrdad
    Gharachorloo, Mohammad
    Manthouri, Mohammad
2021 26TH INTERNATIONAL COMPUTER CONFERENCE, COMPUTER SOCIETY OF IRAN (CSICC), 2021
  • [46] Summary-aware attention for social media short text abstractive summarization
    Wang, Qianlong
    Ren, Jiangtao
    NEUROCOMPUTING, 2021, 425 : 290 - 299
  • [47] A Vision Enhanced Framework for Indonesian Multimodal Abstractive Text-Image Summarization
    Song, Yutao
    Lin, Nankai
    Li, Lingbao
    Jiang, Shengyi
PROCEEDINGS OF THE 2024 27TH INTERNATIONAL CONFERENCE ON COMPUTER SUPPORTED COOPERATIVE WORK IN DESIGN, CSCWD 2024, 2024, : 61 - 66
  • [48] A novel semantic-enhanced generative adversarial network for abstractive text summarization
    Tham Vo
    Soft Computing, 2023, 27 : 6267 - 6280
  • [49] A novel semantic-enhanced generative adversarial network for abstractive text summarization
    Vo, Tham
    SOFT COMPUTING, 2023, 27 (10) : 6267 - 6280
  • [50] Improving Pointer-Generator Network with Keywords Information for Chinese Abstractive Summarization
    Jiang, Xiaoping
    Hu, Po
    Hou, Liwei
    Wang, Xia
    NATURAL LANGUAGE PROCESSING AND CHINESE COMPUTING, PT I, 2018, 11108 : 464 - 474