Arabic text summarization using deep learning approach

Cited by: 25
Authors
Al-Maleh, Molham [1 ]
Desouki, Said [2 ]
Affiliations
[1] Higher Institute for Applied Sciences and Technology, Faculty of Information Technology, Damascus, Syria
[2] Arab International University, Faculty of Informatics and Communication Engineering, Damascus, Syria
Keywords
Natural language processing; Text summarization; Deep learning; Big data; Sequence-to-sequence framework
DOI
10.1186/s40537-020-00386-7
CLC number
TP301 [theory and methods]
Discipline code
081202
Abstract
Natural language processing has witnessed remarkable progress with the advent of deep learning techniques. Text summarization, along with other tasks such as machine translation and sentiment analysis, has used deep neural network models to improve results. Recent text summarization methods follow the sequence-to-sequence encoder-decoder framework, in which neural networks are trained jointly on input and output sequences. Deep neural networks take advantage of big datasets to improve their results. They are supported by the attention mechanism, which handles long texts more efficiently by identifying focus points in the text, and by the copy mechanism, which lets the model copy words from the source directly into the summary. In this research, we re-implement the basic summarization model that applies the sequence-to-sequence framework to Arabic, a language to which this model had not previously been applied for text summarization. We first build an Arabic dataset of summarized article headlines consisting of approximately 300 thousand entries, each pairing an article introduction with its corresponding headline. We then apply baseline summarization models to this dataset and compare the results using the ROUGE metric.
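For a concrete picture of the architecture the abstract describes, below is a minimal sketch of a sequence-to-sequence encoder-decoder with attention, written in PyTorch. It is illustrative only: the layer sizes, the single-linear-layer attention score (a simplification of the full Bahdanau formulation), and the omission of the copy mechanism are assumptions of this sketch, not the authors' implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class Encoder(nn.Module):
    # Reads the source article introduction and returns one state per token.
    def __init__(self, vocab_size, emb_dim=128, hid_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.GRU(emb_dim, hid_dim, batch_first=True)

    def forward(self, src):                          # src: (batch, src_len)
        return self.rnn(self.embed(src))             # per-token outputs, final hidden state

class AttnDecoder(nn.Module):
    # Generates the headline one token at a time, attending over encoder states.
    def __init__(self, vocab_size, emb_dim=128, hid_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.attn = nn.Linear(hid_dim * 2, 1)        # scores [decoder state; encoder state]
        self.rnn = nn.GRU(emb_dim + hid_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, tok, hidden, enc_outputs):     # tok: (batch, 1)
        # Attention: weight every source position by its relevance to the decoder state.
        dec_state = hidden[-1].unsqueeze(1).expand(-1, enc_outputs.size(1), -1)
        scores = self.attn(torch.cat([dec_state, enc_outputs], dim=-1))
        weights = F.softmax(scores, dim=1)           # (batch, src_len, 1)
        context = (weights * enc_outputs).sum(dim=1, keepdim=True)
        rnn_in = torch.cat([self.embed(tok), context], dim=-1)
        output, hidden = self.rnn(rnn_in, hidden)
        return self.out(output.squeeze(1)), hidden   # vocabulary logits, next state

# One decoding step with toy sizes (a real model adds training, copying, beam search):
enc, dec = Encoder(vocab_size=8000), AttnDecoder(vocab_size=8000)
src = torch.randint(0, 8000, (2, 40))                # a batch of 2 tokenized introductions
enc_out, h = enc(src)
logits, h = dec(torch.zeros(2, 1, dtype=torch.long), h, enc_out)

Evaluation against reference headlines can likewise be sketched with the rouge_score package; the paper names only the ROUGE metric, so this choice of library is an assumption:

from rouge_score import rouge_scorer
scorer = rouge_scorer.RougeScorer(["rouge1", "rouge2", "rougeL"])
print(scorer.score("reference headline", "generated headline"))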
Pages: 17