Novel multi-domain attention for abstractive summarisation

Cited: 7
Authors
Qu, Chunxia [1 ]
Lu, Ling [1 ]
Wang, Aijuan [1 ]
Yang, Wu [1 ]
Chen, Yinong [2 ]
Institutions
[1] Chongqing Univ Technol, Coll Comp Sci & Engn, Chongqing 400050, Peoples R China
[2] Arizona State Univ, Sch Comp & Augmented Intelligence, Tempe, AZ USA
Keywords
abstracting; abstractive summarisation; attention mechanism; Bi-LSTM; convolutional neural nets; coverage mechanism; pointer network; recurrent neural nets; text analysis; word processing;
DOI
10.1049/cit2.12117
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Existing abstractive text summarisation models consider only the word-sequence correlations between the source document and the reference summary; because of this narrow perspective, the generated summary often fails to cover the subject of the source document. To address these shortcomings, a multi-domain attention pointer (MDA-Pointer) abstractive summarisation model is proposed in this work. First, the model uses bidirectional long short-term memory (Bi-LSTM) to encode the word and sentence sequences of the source document separately, obtaining semantic representations at both the word and sentence levels. A multi-domain attention mechanism is then established between these semantic representations and the summary words, so that the model generates each summary word conditioned on both word- and sentence-level information. Next, a pointer network selects words either from the vocabulary or from the original word sequence to form the summary, and a coverage mechanism is introduced at both the word and sentence levels to reduce redundancy in the summary content. Finally, experiments on the CNN/Daily Mail dataset show that the ROUGE scores of the model improve both without and with the coverage mechanism, verifying the effectiveness of the proposed model.
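The two-level attention described in the abstract can be sketched as follows. This is a minimal NumPy illustration of attending over word-level and sentence-level encoder states and combining the two contexts, not the authors' implementation: the additive (Bahdanau-style) scoring form, the dimensions, and all names (`word_states`, `sent_states`, `attention`, etc.) are assumptions for illustration only.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def attention(states, query, W_s, W_q, v):
    # additive attention: score_i = v . tanh(W_s h_i + W_q q)
    scores = np.array([v @ np.tanh(W_s @ h + W_q @ query) for h in states])
    weights = softmax(scores)
    context = weights @ states  # weighted sum of encoder states
    return context, weights

rng = np.random.default_rng(0)
d = 8
word_states = rng.normal(size=(5, d))  # Bi-LSTM word-level encodings
sent_states = rng.normal(size=(3, d))  # Bi-LSTM sentence-level encodings
query = rng.normal(size=d)             # decoder state at the current step

W_s, W_q = rng.normal(size=(d, d)), rng.normal(size=(d, d))
v = rng.normal(size=d)

# one attention context per domain (words, sentences), then concatenate;
# in the full model this combined context would feed the vocabulary
# distribution and the pointer network's copy probability
word_ctx, word_w = attention(word_states, query, W_s, W_q, v)
sent_ctx, sent_w = attention(sent_states, query, W_s, W_q, v)
multi_domain_ctx = np.concatenate([word_ctx, sent_ctx])
```

Each domain yields its own normalised attention distribution, so the decoder can weight source words and source sentences independently before the contexts are merged.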
Pages: 796 - 806 (11 pages)
Related papers (50 records in total)
  • [41] A novel Topology Aggregation approach for shared protection in multi-domain networks
    Dieu-Linh Truong
    Jaumard, Brigitte
    OPTICAL SWITCHING AND NETWORKING, 2012, 9 (02) : 81 - 96
  • [42] RMMDI: A Novel Framework for Role Mining Based on the Multi-Domain Information
    Bai, Wei
    Pan, Zhisong
    Guo, Shize
    Chen, Zhe
    SECURITY AND COMMUNICATION NETWORKS, 2019,
  • [43] Design and Implementation of a Novel Multi-Domain Management for Automotive Power Nets
    Tippe, Laurenz
    Oberloher, Alberto de Vergara
    Ebnicher, Michael
    Froschl, Joachim
    Herzog, Hans-Georg
    2022 IEEE/AIAA TRANSPORTATION ELECTRIFICATION CONFERENCE AND ELECTRIC AIRCRAFT TECHNOLOGIES SYMPOSIUM (ITEC+EATS 2022), 2022, : 147 - 154
  • [44] Multi-layer Multi-domain Networking
    Lehman, Thomas
    Yang, Xi
    2008 7TH INTERNATIONAL CONFERENCE ON THE OPTICAL INTERNET (COIN), 2008, : 171 - 172
  • [45] Improving Biomedical Abstractive Summarisation with Knowledge Aggregation from Citation Papers
    Tang, Chen
    Wang, Shun
    Goldsack, Tomas
    Lin, Chenghua
    2023 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING, EMNLP 2023, 2023, : 606 - 618
  • [46] Multi-task learning with graph attention networks for multi-domain task-oriented dialogue systems
    Zhao, Meng
    Wang, Lifang
    Jiang, Zejun
    Li, Ronghan
    Lu, Xinyu
    Hu, Zhongtian
    KNOWLEDGE-BASED SYSTEMS, 2023, 259
  • [47] MCR: Multilayer cross-fusion with reconstructor for multimodal abstractive summarisation
    Yuan, Jingshu
    Yun, Jing
    Zheng, Bofei
    Jiao, Lei
    Liu, Limin
    IET COMPUTER VISION, 2023, 17 (04) : 389 - 403
  • [48] Revisiting Multi-Domain Machine Translation
    MinhQuang Pham
    Crego, Josep Maria
    Yvon, Francois
    TRANSACTIONS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, 2021, 9 : 17 - 35
  • [49] Analysis of Multi-domain Protein Dynamics
    Roy, Amitava
    PROTEIN SCIENCE, 2016, 25 : 44 - 45
  • [50] Management mechanism for multi-domain strategy
    Duan, Li-Juan
    Liu, Yan
    Yang, Zhen
    Lai, Ying-Xu
    Beijing Gongye Daxue Xuebao/Journal of Beijing University of Technology, 2010, 36 (SUPPL. 2): : 49 - 53