Abstractive method of text summarization with sequence to sequence RNNs

Cited by: 0
Authors:
Masum, Abu Kaisar Mohammad [1 ]
Abujar, Sheikh [1 ]
Talukder, Md Ashraful Islam [1 ]
Rabby, A. K. M. Shahariar Azad [1 ]
Hossain, Syed Akhter [1 ]
Affiliations:
[1] Daffodil Int Univ, Dept CSE, Dhaka, Bangladesh
Source:
2019 10TH INTERNATIONAL CONFERENCE ON COMPUTING, COMMUNICATION AND NETWORKING TECHNOLOGIES (ICCCNT), 2019
Keywords:
Text Processing; Word-Embedding; Missing Word Counting; Vocabulary Counting; Deep Learning; Bi-directional RNN; Encoding; Decoding;
DOI:
Not available
CLC number:
TP3 [Computing Technology, Computer Technology]
Subject classification code:
0812
Abstract:
Text summarization has become a prominent problem in natural language processing and deep learning in recent years. In general, a text summary is a short note that captures the content of a larger document. Our main purpose is to create short, fluent, and understandable abstractive summaries of text documents. To build the summarizer, we used the Amazon Fine Food Reviews dataset, which is available on Kaggle: the review text descriptions serve as input, and a simple summary of each review is generated as output. To help produce expressive summaries, we used a bidirectional RNN with LSTM units in the encoding layer and an attention model in the decoding layer, and applied a sequence-to-sequence model to generate short summaries of the food reviews. Building an abstractive text summarizer involves several challenges, such as text processing, vocabulary counting, missing-word counting, word embedding, improving model efficiency (reducing the loss value), and producing fluent machine-generated summaries. In this paper, the main goal was to increase the efficiency and reduce the training loss of the sequence-to-sequence model in order to build a better abstractive text summarizer. In our experiments, we reduced the training loss to 0.036, and our abstractive text summarizer is able to create short English summaries from English source text.
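As an illustration of the architecture described in the abstract, the following is a minimal Keras sketch (not the authors' released code) of a sequence-to-sequence model with a bidirectional LSTM encoder and an attention-based decoder. The vocabulary size, embedding dimension, hidden size, sequence lengths, and layer names are illustrative assumptions, not values reported in the paper.

from tensorflow.keras import layers, Model

vocab_size = 20000    # assumed vocabulary size after preprocessing
embed_dim = 128       # assumed word-embedding dimension
hidden_dim = 256      # assumed LSTM hidden size
max_text_len = 100    # assumed maximum review length (tokens)
max_sum_len = 15      # assumed maximum summary length (tokens)

# Encoder: word embedding followed by a bidirectional LSTM.
enc_inputs = layers.Input(shape=(max_text_len,), name="review_tokens")
enc_emb = layers.Embedding(vocab_size, embed_dim)(enc_inputs)
enc_outputs, fh, fc, bh, bc = layers.Bidirectional(
    layers.LSTM(hidden_dim, return_sequences=True, return_state=True))(enc_emb)
# Concatenate forward and backward states to initialise the decoder.
state_h = layers.Concatenate()([fh, bh])
state_c = layers.Concatenate()([fc, bc])

# Decoder: word embedding, LSTM, and dot-product attention over encoder outputs.
dec_inputs = layers.Input(shape=(max_sum_len,), name="summary_tokens")
dec_emb = layers.Embedding(vocab_size, embed_dim)(dec_inputs)
dec_outputs, _, _ = layers.LSTM(
    hidden_dim * 2, return_sequences=True, return_state=True)(
    dec_emb, initial_state=[state_h, state_c])
context = layers.Attention()([dec_outputs, enc_outputs])
dec_concat = layers.Concatenate()([dec_outputs, context])

# Per-timestep softmax over the vocabulary produces the summary tokens.
outputs = layers.TimeDistributed(
    layers.Dense(vocab_size, activation="softmax"))(dec_concat)

model = Model([enc_inputs, dec_inputs], outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

Training such a model with teacher forcing on (review text, summary) pairs and monitoring the cross-entropy loss corresponds to the training-loss objective discussed in the abstract.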
Pages: 5
Related papers (50 records in total):
  • [1] Bengali abstractive text summarization using sequence to sequence RNNs
    Talukder, Md Ashraful Islam
    Abujar, Sheikh
    Masum, Abu Kaisar Mohammad
    Faisal, Fahad
    Hossain, Syed Akhter
    2019 10TH INTERNATIONAL CONFERENCE ON COMPUTING, COMMUNICATION AND NETWORKING TECHNOLOGIES (ICCCNT), 2019,
  • [2] Neural Abstractive Text Summarization with Sequence-to-Sequence Models
    Shi, Tian
    Keneshloo, Yaser
    Ramakrishnan, Naren
    Reddy, Chandan K.
ACM/IMS Transactions on Data Science, 2021, 2 (01)
  • [3] Deep learning based sequence to sequence model for abstractive telugu text summarization
    G. L. Anand Babu
    Srinivasu Badugu
    Multimedia Tools and Applications, 2023, 82 : 17075 - 17096
  • [4] Towards neural abstractive clinical trial text summarization with sequence to sequence models
    Cintas, Celia
    Ogallo, William
    Walcott, Aisha
    Remy, Sekou L.
    Akinwande, Victor
    Osebe, Samuel
    2019 IEEE INTERNATIONAL CONFERENCE ON HEALTHCARE INFORMATICS (ICHI), 2019, : 388 - 390
  • [5] Deep learning based sequence to sequence model for abstractive telugu text summarization
    Babu, G. L. Anand
    Badugu, Srinivasu
    MULTIMEDIA TOOLS AND APPLICATIONS, 2023, 82 (11) : 17075 - 17096
  • [6] Turkish abstractive text summarization using pretrained sequence-to-sequence models
    Baykara, Batuhan
    Gungor, Tunga
    NATURAL LANGUAGE ENGINEERING, 2023, 29 (05) : 1275 - 1304
  • [7] A Reinforced Topic-Aware Convolutional Sequence-to-Sequence Model for Abstractive Text Summarization
    Wang, Li
    Yao, Junlin
    Tao, Yunzhe
    Zhong, Li
    Liu, Wei
    Du, Qiang
    PROCEEDINGS OF THE TWENTY-SEVENTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2018, : 4453 - 4460
  • [8] A Novel Deep Learning Attention Based Sequence to Sequence Model for Automatic Abstractive Text Summarization
    Abd Algani Y.M.
    International Journal of Information Technology, 2024, 16 (6) : 3597 - 3603
  • [9] Towards Sequence-to-Sequence Neural Model for Croatian Abstractive Summarization
    Davidovic, Vlatka
    Ipsic, Sanda Martincic
    CENTRAL EUROPEAN CONFERENCE ON INFORMATION AND INTELLIGENT SYSTEMS, CECIIS, 2023, : 309 - 315
  • [10] Abstractive Text Summarization: Enhancing Sequence-to-Sequence Models Using Word Sense Disambiguation and Semantic Content Generalization
    Kouris, Panagiotis
    Alexandridis, Georgios
    Stafylopatis, Andreas
    COMPUTATIONAL LINGUISTICS, 2021, 47 (04) : 813 - 859