Neural Network Structure Optimization by Simulated Annealing

Cited by: 19
Authors
Kuo, Chun Lin [1 ]
Kuruoglu, Ercan Engin [1 ]
Chan, Wai Kin Victor [1 ]
Affiliation
[1] Tsinghua Berkeley Shenzhen Inst, Shenzhen 518071, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
neural network; pruning; structure optimization; heuristics; simulated annealing; ALGORITHM;
DOI
10.3390/e24030348
CLC number
O4 [Physics]
Subject classification code
0702
Abstract
A critical problem in large neural networks is over-parameterization: the large number of weight parameters limits their use on edge devices due to prohibitive computational power and memory/storage requirements. To make neural networks practical on edge devices and in real-time industrial applications, they need to be compressed in advance. Since edge devices cannot train networks, or access trained ones when internet resources are scarce, preloading smaller networks is essential. Various works in the literature have shown that redundant branches of a fully connected network can be pruned strategically without significantly sacrificing performance. However, the majority of these methodologies require high computational resources because they interleave weight training via the back-propagation algorithm with the network compression process. In this work, we draw attention to optimizing the network structure so that performance is preserved despite aggressive pruning. The structure optimization is performed with the simulated annealing algorithm alone, without using back-propagation for branch weight training. Being a heuristic, non-convex optimization method, simulated annealing provides a globally near-optimal solution to this NP-hard problem for a given percentage of branch pruning. Our simulation results show that simulated annealing can significantly reduce the complexity of a fully connected network while maintaining performance, without the help of back-propagation.
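The abstract describes simulated annealing over pruning decisions with no back-propagation. The following Python sketch illustrates that idea only in broad strokes and is not the authors' implementation: the toy dataset, layer sizes, fixed random weights, swap-based neighbor move, and cooling schedule are all assumptions made for illustration. The state is a binary connection mask, the energy is the loss of the masked network on held-out data, and no weights are ever trained.

# Minimal sketch (assumptions noted above): prune a fixed-weight, fully
# connected network by simulated annealing over binary connection masks,
# without back-propagation.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 2-class problem in 20 dimensions (assumed for illustration).
X = rng.normal(size=(200, 20))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

# Fixed (untrained) weights of a 20-16-2 fully connected network.
W1 = rng.normal(scale=0.5, size=(20, 16))
W2 = rng.normal(scale=0.5, size=(16, 2))

def loss(mask1, mask2):
    """Cross-entropy of the masked network; the energy to be minimized."""
    h = np.maximum(X @ (W1 * mask1), 0.0)          # ReLU hidden layer
    logits = h @ (W2 * mask2)
    logits -= logits.max(axis=1, keepdims=True)
    p = np.exp(logits)
    p /= p.sum(axis=1, keepdims=True)
    return -np.log(p[np.arange(len(y)), y] + 1e-12).mean()

def random_mask(shape, keep_ratio):
    """Binary mask keeping a fixed fraction of connections."""
    m = np.zeros(shape, dtype=float).ravel()
    keep = rng.choice(m.size, size=int(keep_ratio * m.size), replace=False)
    m[keep] = 1.0
    return m.reshape(shape)

def perturb(mask):
    """Neighbor move: swap one kept and one pruned connection (sparsity stays fixed)."""
    m = mask.copy().ravel()
    on, off = np.flatnonzero(m == 1), np.flatnonzero(m == 0)
    if len(on) and len(off):
        m[rng.choice(on)], m[rng.choice(off)] = 0.0, 1.0
    return m.reshape(mask.shape)

# Simulated annealing over the pruning masks (80% of branches removed).
keep_ratio = 0.2
m1, m2 = random_mask(W1.shape, keep_ratio), random_mask(W2.shape, keep_ratio)
E = loss(m1, m2)
T, cooling = 1.0, 0.995                              # assumed schedule
for step in range(5000):
    cand1, cand2 = perturb(m1), perturb(m2)
    E_new = loss(cand1, cand2)
    # Accept better states always, worse states with probability exp(-dE / T).
    if E_new < E or rng.random() < np.exp((E - E_new) / T):
        m1, m2, E = cand1, cand2, E_new
    T *= cooling

print(f"final loss with {int((1 - keep_ratio) * 100)}% of branches pruned: {E:.4f}")

The swap move keeps the pruning percentage constant, matching the paper's framing of finding the best structure for a given fraction of removed branches; the energy function here is a plain cross-entropy on a synthetic task and would be replaced by whatever performance measure the target application uses.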
Pages: 18