Improving convolutional neural network for text classification by recursive data pruning

Cited: 28
Authors
Li, Qi [2 ]
Li, Pengfei [1 ]
Mao, Kezhi [1 ]
Lo, Edmond Yat-Man [2 ,3 ]
Affiliations
[1] Nanyang Technol Univ, Sch Elect & Elect Engn, Singapore 639798, Singapore
[2] Nanyang Technol Univ, Inst Catastrophe Risk Management, Interdisciplinary Grad Programme, Singapore 639798, Singapore
[3] Nanyang Technol Univ, Sch Civil & Environm Engn, Singapore 639798, Singapore
Keywords
Data pruning; Convolutional neural network; Text classification; SENTIMENT ANALYSIS;
DOI
10.1016/j.neucom.2020.07.049
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In spite of the state-of-the-art performance of deep neural networks, shallow neural networks are still the choice in applications with limited computing and memory resources. The convolutional neural network (CNN), in particular the one-convolutional-layer CNN, is a widely used shallow neural network in natural language processing tasks such as text classification. However, it was found that CNNs may misfit to task-irrelevant words in the dataset, which in turn leads to unsatisfactory performance. To alleviate this problem, an attention mechanism can be integrated into the CNN, but this consumes the limited resources. In this paper, we propose to address the misfitting problem from a novel angle: pruning task-irrelevant words from the dataset. The proposed method evaluates each convolutional filter based on the discriminative power of the feature it generates at the pooling layer, and prunes the words captured by the poorly performing filters. Experimental results show that our proposed model significantly outperforms the CNN baseline model. Moreover, our proposed model achieves performance similar to or better than the benchmark models (attention-integrated CNNs) while requiring fewer parameters and FLOPs, and is therefore a suitable choice for resource-limited scenarios, such as mobile applications. (C) 2020 Elsevier B.V. All rights reserved.
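The pruning idea described in the abstract can be sketched in a few lines: under max pooling, each filter "captures" the word position with its highest activation, so words captured only by low-scoring filters can be dropped. The sketch below is a minimal illustration, not the paper's implementation: the Fisher-style between/within score, the helper names, and the threshold are all assumptions.

```python
import numpy as np

def captured_positions(activations):
    # activations: (num_filters, seq_len) conv outputs for one sentence.
    # Under max pooling, each filter "captures" its argmax position.
    return activations.argmax(axis=1)

def filter_scores(pooled_by_class):
    # pooled_by_class: dict mapping class label -> (num_samples, num_filters)
    # pooled features. Score each filter by a simple between-class /
    # within-class variance ratio (a Fisher-style criterion; the paper's
    # exact discriminative measure may differ).
    means = np.stack([v.mean(axis=0) for v in pooled_by_class.values()])
    within = np.stack([v.var(axis=0) for v in pooled_by_class.values()]).mean(axis=0)
    between = means.var(axis=0)
    return between / (within + 1e-8)

def prune_words(tokens, activations, scores, threshold):
    # Drop words captured exclusively by poorly scoring filters;
    # words also captured by a good filter are kept.
    pos = captured_positions(activations)
    bad = {p for f, p in enumerate(pos) if scores[f] < threshold}
    good = {p for f, p in enumerate(pos) if scores[f] >= threshold}
    drop = bad - good
    return [t for i, t in enumerate(tokens) if i not in drop]
```

For example, if filter 0 (discriminative) peaks on "great" and filter 1 (non-discriminative) peaks on "the", pruning removes only "the".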
Pages: 143-152
Page count: 10
Cited References
43 records total
[1] Abualigah L.M.Q., 2019, FEATURE SELECTION EN
[2] Abualigah, Laith Mohammad; Khader, Ahamad Tajudin; Hanandeh, Essam Said. A new feature selection method to improve the document clustering using particle swarm optimization algorithm. Journal of Computational Science, 2018, 25: 456-466.
[3] Abualigah, Laith Mohammad; Khader, Ahamad Tajudin. Unsupervised text feature selection technique based on hybrid particle swarm optimization algorithm with genetic operators for the text clustering. Journal of Supercomputing, 2017, 73(11): 4773-4795.
[4] [Anonymous], arXiv:1502.01710
[5] [Anonymous], arXiv:1408.5882
[6] Anwar, Sajid; Hwang, Kyuyeon; Sung, Wonyong. Structured Pruning of Deep Convolutional Neural Networks. ACM Journal on Emerging Technologies in Computing Systems, 2017, 13(3).
[7] Bahdanau D., arXiv:1409.0473
[8] Bengio, Yoshua; Courville, Aaron; Vincent, Pascal. Representation Learning: A Review and New Perspectives. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2013, 35(8): 1798-1828.
[9] Chen CS, 2014, CAN CON EL COMP EN
[10] Cogswell M., arXiv:1511.06068