A Computationally Efficient Weight Pruning Algorithm for Artificial Neural Network Classifiers

Cited: 0
Authors
Institution
[1] Thapar University,Electronics and Communication Engineering Department
Source
Arabian Journal for Science and Engineering | 2018 / Volume 43
Keywords
Weight pruning; Artificial neural network; Backpropagation; Complexity penalty; Fisher information; Pattern classification;
DOI
Not available
Abstract
A novel technique is proposed to prune the weights of artificial neural networks (ANNs) during training with the backpropagation algorithm. Iterative weight updates via gradient descent do not guarantee convergence within a specified number of epochs. Pruning non-relevant weights not only reduces computational complexity but also improves classification performance. The algorithm first defines the “relevance” of initialized weights in a statistical sense by introducing a coefficient of dominance for each weight converging on a hidden node, and then applies the concept of a complexity penalty. Based on the complexity penalty of each weight, a decision is made to either prune or retain it. It is shown analytically that a weight with a higher complexity penalty carries a higher degree of Fisher information, which in turn implies a greater ability to capture variations in the input set for better classification. Simulation experiments on five benchmark data sets reveal that ANNs pruned with the proposed technique exhibit faster convergence, lower execution time, and a higher success rate in the test phase, yielding a substantial reduction in computational resources. For complex architectures, early convergence was found to be directly correlated with the percentage of weights pruned. The efficacy of the technique has been validated on several benchmark datasets with a large diversity of attributes.
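The abstract does not give the exact formulas for the coefficient of dominance or the complexity penalty, so the following is only a minimal illustrative sketch of the general idea: score each weight converging on a hidden node by its relative share of that node's total incoming weight magnitude, then prune weights whose score falls below a threshold. The function names, the magnitude-share scoring, and the threshold value are all assumptions for illustration, not the paper's actual method.

```python
import numpy as np

def coefficient_of_dominance(W):
    """Illustrative dominance score (an assumption, not the paper's formula):
    each weight's share of the total incoming weight magnitude of the
    hidden node (column) it converges on."""
    col_totals = np.abs(W).sum(axis=0, keepdims=True)
    return np.abs(W) / np.maximum(col_totals, 1e-12)  # avoid division by zero

def prune_weights(W, threshold=0.05):
    """Zero out weights whose dominance score is below the threshold;
    return the pruned weight matrix and the keep-mask."""
    mask = coefficient_of_dominance(W) >= threshold
    return W * mask, mask
```

For example, with `W = np.array([[1.0, 0.01], [0.5, 1.0]])` and the default threshold, the 0.01 weight (dominance ≈ 0.01 of its node's incoming magnitude) is pruned while the others are retained. In the actual algorithm, this decision would instead be driven by the complexity penalty derived for each weight.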
Pages: 6787–6799
Page count: 12