A Novel Deep Learning Attention Based Sequence to Sequence Model for Automatic Abstractive Text Summarization

Cited by: 0
Authors
Abd Algani Y.M. [1 ]
Affiliations
[1] Department of Mathematics, The Arab Academic College for Education in Israel-Haifa, Haifa
Keywords
Abstractive text summarization; Attention mechanism; LSTM; Sequence to sequence model; Word embedded layer;
DOI
10.1007/s41870-024-01934-7
Abstract
Abstractive text summarization is one of the trending topics in the field of natural language processing (NLP). In this type of text summarization, new sentences are generated from the original text, irrespective of whether these sentences exist in the original corpus. Several sequence-to-sequence models exist for abstractive text summarization, but they suffer from challenges such as redundancy, limited vocabulary distribution, and irrelevant results. To overcome these, this paper introduces a novel attention-based sequence-to-sequence model for automatic abstractive text summarization. The proposed sequence-to-sequence model comprises a Bi-LSTM (bidirectional long short-term memory) encoder, a unidirectional LSTM decoder, an attention mechanism, and a word embedding layer that computes the probability distribution of each word in the original document. This enables semantic feature extraction of words, which yields more relevant summaries. The performance of the proposed model is validated on the CNN/Daily Mail dataset and assessed using Recall-Oriented Understudy for Gisting Evaluation (ROUGE) metrics. The results show that the proposed model achieves higher ROUGE-1, ROUGE-2, and ROUGE-L scores than the existing baseline models. © Bharati Vidyapeeth's Institute of Computer Applications and Management 2024.
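The attention mechanism described in the abstract can be illustrated with a minimal sketch of one decoder step: each encoder hidden state is scored against the current decoder state, the scores are normalized with a softmax into attention weights, and the weighted sum of encoder states forms the context vector fed to the decoder. The paper's exact scoring function is not stated in this record, so this sketch assumes simple dot-product scoring; the function names and vector shapes below are illustrative, not taken from the paper.

```python
import math

def softmax(scores):
    # Numerically stable softmax: subtract the max before exponentiating.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def attention_context(encoder_states, decoder_state):
    """One attention step (dot-product scoring assumed):
    score each encoder hidden state against the decoder state,
    normalize with softmax, and return the weighted sum of the
    encoder states (the context vector) plus the weights."""
    scores = [dot(h, decoder_state) for h in encoder_states]
    weights = softmax(scores)
    dim = len(decoder_state)
    context = [
        sum(w * h[i] for w, h in zip(weights, encoder_states))
        for i in range(dim)
    ]
    return context, weights

# Toy example: two encoder states; the decoder state is aligned
# with the first, so the first state receives the larger weight.
ctx, w = attention_context([[1.0, 0.0], [0.0, 1.0]], [1.0, 0.0])
```

In the full model, a Bi-LSTM would produce `encoder_states` (forward and backward states concatenated per token) and a unidirectional LSTM would produce `decoder_state` at each output step; the context vector is then combined with the decoder state to predict the next summary word.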
Pages: 3597–3603
Page count: 6
References
21 entries in total
[1]  
Gangundi R., Sridhar R., IWM-LSTM encoder for abstractive text summarization, Multimed Tools Appl, (2024)
[2]  
Alami Merrouni Z., Frikh B., Ouhbi B., EXABSUM: a new text summarization approach for generating extractive and abstractive summaries, J Big Data, 10, (2023)
[3]  
Gupta P., Nigam S., Singh R., A Ranking based Language Model for Automatic Extractive Text Summarization, 2022 First International Conference on Artificial Intelligence Trends and Pattern Recognition (ICAITPR), Hyderabad, India, pp. 1-5, (2022)
[4]  
Khan B., Shah Z.A., Usman M., Khan I., Niazi B., Exploring the Landscape of Automatic Text Summarization: A Comprehensive Survey, IEEE Access, 11, pp. 109819-109840, (2023)
[5]  
Moratanch N., Chitrakala S., Anaphora resolved abstractive text summarization (AR-ATS) system, Multimed Tools Appl, 82, pp. 4569-4597, (2023)
[6]  
Madhuri J.N., Ganesh Kumar R., Extractive Text Summarization Using Sentence Ranking, 2019 International Conference on Data Science and Communication (IconDSC), Bangalore, India, pp. 1-3, (2019)
[7]  
Mahalleh E.R., Gharehchopogh F.S., An automatic text summarization based on valuable sentences selection, Int J Inf Tecnol, 14, pp. 2963-2969, (2022)
[8]  
Yadav A.K., Ranvijay R., Yadav R.S., et al., Large text document summarization based on an enhanced fuzzy logic approach, Int J Inf Tecnol, (2023)
[9]  
Radarapu R., Gopal A.S.S., et al., Video summarization and captioning using dynamic mode decomposition for surveillance, Int J Inf Tecnol, 13, pp. 1927-1936, (2021)
[10]  
Wason R., Deep Learning: Evolution and Expansion, Cogn Syst Res, (2018)