A Computationally Efficient Weight Pruning Algorithm for Artificial Neural Network Classifiers

Cited by: 0
Authors
Affiliations
[1] Thapar University, Electronics and Communication Engineering Department
Source
Arabian Journal for Science and Engineering | 2018, Vol. 43
Keywords
Weight pruning; Artificial neural network; Backpropagation; Complexity penalty; Fisher information; Pattern classification
DOI
Not available
Abstract
A novel technique is proposed for pruning the weights of artificial neural networks (ANNs) during training with the backpropagation algorithm. Iterative updating of weights through the gradient-descent mechanism does not guarantee convergence within a specified number of epochs, and pruning non-relevant weights not only reduces computational complexity but also improves classification performance. The algorithm first defines the "relevance" of the initialized weights in a statistical sense by introducing a coefficient of dominance for each weight converging on a hidden node, and then applies the concept of a complexity penalty. Based upon the complexity penalty of each weight, a decision is made to either prune or retain it. It is shown analytically that a weight with a higher complexity penalty carries a higher degree of Fisher information, which in turn implies a greater ability to capture variations in the input set for better classification. Simulation experiments on five benchmark data sets reveal that ANNs pruned with the proposed technique converge faster, execute in less time, and achieve a higher success rate in the test phase, yielding a substantial reduction in computational resources. For complex architectures, early convergence was found to be directly correlated with the percentage of weights pruned. The efficacy of the technique has been validated on several benchmark data sets with a large diversity of attributes.
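The abstract describes the pruning rule only at a high level, so the following is a minimal Python sketch of the general idea, not the paper's exact method: a coefficient of dominance for each weight converging on a hidden node (assumed here to be the weight's share of the total squared magnitude entering that node), combined with a weight-elimination-style complexity penalty, drives the prune-or-retain decision. The function names, the penalty form c(w) = (w/w0)^2 / (1 + (w/w0)^2), and the threshold value are all illustrative assumptions.

```python
import numpy as np

def dominance_coefficients(W):
    """Per-weight coefficient of dominance (assumed form): each weight's
    share of the total squared magnitude converging on its hidden node.
    W has shape (n_inputs, n_hidden); column j is the fan-in of node j."""
    col_energy = np.sum(W ** 2, axis=0, keepdims=True) + 1e-12
    return (W ** 2) / col_energy

def complexity_penalty(W, w0=1.0):
    """Weight-elimination-style complexity penalty per weight:
    c(w) = (w/w0)^2 / (1 + (w/w0)^2). Small weights get a penalty
    near 0; large weights saturate toward 1."""
    r = (W / w0) ** 2
    return r / (1.0 + r)

def prune_mask(W, threshold=0.05):
    """Retain a weight only if its dominance-scaled complexity penalty
    exceeds the threshold; everything else is marked for pruning."""
    score = dominance_coefficients(W) * complexity_penalty(W)
    return score > threshold

# Example: prune the input-to-hidden weight matrix after initialization.
rng = np.random.default_rng(0)
W = rng.normal(scale=0.5, size=(8, 4))   # 8 inputs, 4 hidden nodes
mask = prune_mask(W)
W = W * mask                             # zero out non-relevant weights
print(f"pruned {100 * (1 - mask.mean()):.1f}% of weights")
```

In a full training run, the mask would be re-applied after each backpropagation update so that pruned weights remain frozen at zero.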
Pages: 6787-6799 (12 pages)