An abstractive text summarization using deep learning in Assamese

Cited by: 1
Authors
Goutom P.J. [1]
Baruah N. [1]
Sonowal P. [1]
Affiliations
[1] Dibrugarh University, Dibrugarh, Assam
Keywords
Abstractive; Attention; NLP; Seq2Seq; Text summarization;
DOI: 10.1007/s41870-023-01279-7
Abstract
Abstractive text summarization with long short-term memory (LSTM) networks is a prominent strategy in natural language processing that aims to construct a compact, coherent summary of a given text by learning a semantic representation of the input. In this study, we present a seq2seq LSTM model with attention between the encoder and decoder that generates short, coherent, human-like summaries capturing the crucial information of the original text. We compiled a dataset from Asomiya Pratidin, an Assamese online news website, comprising around 10,000 Assamese news articles with corresponding human-written summaries. Our primary objective is to build an abstractive summarizer for Assamese while minimizing training loss; we reduced the training loss to 0.3032 and produced fluent summaries. © 2023, The Author(s), under exclusive licence to Bharati Vidyapeeth's Institute of Computer Applications and Management.
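The attention mechanism between encoder and decoder that the abstract describes can be illustrated with a minimal NumPy sketch of Bahdanau-style additive attention (the mechanism introduced in the Bahdanau et al. reference the paper draws on). All dimensions, weight matrices, and function names here are hypothetical illustrations, not the authors' implementation:

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def additive_attention(enc_states, dec_state, W_enc, W_dec, v):
    """Bahdanau-style additive attention (illustrative sketch).

    enc_states: (T, H) encoder hidden states, one per source token
    dec_state:  (H,)   current decoder hidden state
    W_enc, W_dec, v: learned projection parameters (here random)
    """
    # Score each encoder state against the decoder state.
    scores = np.tanh(enc_states @ W_enc + dec_state @ W_dec) @ v  # (T,)
    weights = softmax(scores)          # attention distribution over tokens
    context = weights @ enc_states     # (H,) weighted sum of encoder states
    return context, weights

# Hypothetical toy dimensions: T source tokens, hidden size H, attention size A.
rng = np.random.default_rng(0)
T, H, A = 5, 8, 4
enc = rng.normal(size=(T, H))
dec = rng.normal(size=(H,))
W_enc = rng.normal(size=(H, A))
W_dec = rng.normal(size=(H, A))
v = rng.normal(size=(A,))

context, weights = additive_attention(enc, dec, W_enc, W_dec, v)
```

At each decoding step the context vector is concatenated with the decoder state before predicting the next summary word, which is what lets the model focus on the most relevant parts of the source article.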
Pages: 2365–2372 (7 pages)