Indonesian Abstractive Text Summarization Using Bidirectional Gated Recurrent Unit

Cited by: 19
Authors
Adelia, Rike [1]
Suyanto, Suyanto [1]
Wisesty, Untari Novia [1]
Affiliation
[1] Telkom Univ, Sch Comp, Jl Telekomunikasi 01 Terusan Buah Batu, Bandung 40257, West Java, Indonesia
Source
4TH INTERNATIONAL CONFERENCE ON COMPUTER SCIENCE AND COMPUTATIONAL INTELLIGENCE (ICCSCI 2019): ENABLING COLLABORATION TO ESCALATE IMPACT OF RESEARCH RESULTS FOR SOCIETY | 2019 / Vol. 157
Keywords
abstractive text summarization; Bahasa Indonesia; bidirectional gated recurrent unit; recurrent neural network
DOI
10.1016/j.procs.2019.09.017
CLC Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Abstractive text summarization is more challenging than extractive summarization because it paraphrases the entire content of the text, but it produces a more natural summary with higher inter-sentence cohesion. Recurrent Neural Networks (RNNs) have been successful at abstractive summarization of English and Chinese texts. The Bidirectional Gated Recurrent Unit (BiGRU) RNN architecture is used so that the generated summaries are influenced by the surrounding words. In this research, this method is applied to Bahasa Indonesia to improve on text summarizers that are commonly built with extractive methods and suffer from low inter-sentence cohesion. An evaluation on a dataset of Indonesian journal documents shows that the proposed model can summarize the overall content of the test documents into summaries with high similarity to the provided abstracts, indicating that it successfully understands the source text when generating a summary. (C) 2019 The Authors. Published by Elsevier B.V.
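The core idea in the abstract, that a bidirectional GRU lets each position's encoding reflect both left and right context, can be illustrated with a toy sketch. This is not the authors' implementation: it uses scalar weights and a hidden size of 1 for readability, whereas a real summarizer uses learned weight matrices, word embeddings, and an attention-equipped decoder. All names and values below are illustrative.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def gru_step(x, h, w):
    """One GRU update for scalar input x and scalar hidden state h.

    w is a dict of toy scalar weights (illustrative, not learned).
    """
    z = sigmoid(w["wz"] * x + w["uz"] * h)                 # update gate
    r = sigmoid(w["wr"] * x + w["ur"] * h)                 # reset gate
    h_tilde = math.tanh(w["wh"] * x + w["uh"] * (r * h))   # candidate state
    return (1.0 - z) * h + z * h_tilde                     # interpolate old/new

def bigru_encode(xs, w_fwd, w_bwd):
    """Encode a sequence with a forward and a backward GRU pass,
    returning per-position (forward, backward) state pairs."""
    h, fwd = 0.0, []
    for x in xs:                       # left-to-right pass
        h = gru_step(x, h, w_fwd)
        fwd.append(h)
    h, bwd = 0.0, []
    for x in reversed(xs):             # right-to-left pass
        h = gru_step(x, h, w_bwd)
        bwd.append(h)
    bwd.reverse()
    # Concatenation: position t now carries context from both directions.
    return list(zip(fwd, bwd))

# Toy usage: three "word" values encoded with shared illustrative weights.
w = {"wz": 0.5, "uz": 0.1, "wr": 0.5, "ur": 0.1, "wh": 1.0, "uh": 0.5}
states = bigru_encode([1.0, -1.0, 0.5], w, w)
```

In a full sequence-to-sequence summarizer, these concatenated states would feed an attention mechanism and a GRU decoder that emits the summary tokens; the point of the bidirectional encoder is that the forward state at position t sees only words up to t, while the backward state at t sees everything after it.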
Pages: 581-588 (8 pages)