Training Behavior of Sparse Neural Network Topologies

Cited by: 3
Authors
Alford, Simon [1]
Robinett, Ryan [1]
Milechin, Lauren [2]
Kepner, Jeremy [1,3]
Affiliations
[1] MIT, Department of Mathematics, Cambridge, MA 02142, USA
[2] MIT, Department of Earth, Atmospheric and Planetary Sciences, Cambridge, MA, USA
[3] MIT Lincoln Laboratory, Supercomputing Center, Lexington, MA, USA
Source
2019 IEEE High Performance Extreme Computing Conference (HPEC), 2019
Keywords
neural network; pruning; sparse; training
DOI
10.1109/hpec.2019.8916385
Chinese Library Classification
TP3 [Computing Technology, Computer Technology]
Subject Classification Code
0812
Abstract
Improvements in the performance of deep neural networks have often come through the design of larger and more complex networks. As a result, fast memory is a significant limiting factor in our ability to improve network performance. One approach to overcoming this limit is the design of sparse neural networks, which can be both very large and efficiently trained. In this paper we experiment with training on sparse neural network topologies. We test pruning-based topologies, which are derived from an initially dense network whose connections are pruned, as well as RadiX-Nets, a class of network topologies with proven connectivity and sparsity properties. Results show that sparse networks obtain accuracies comparable to those of dense networks, but extreme levels of sparsity cause instability in training, which merits further study.
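The pruning-based topologies mentioned in the abstract start from an initially dense layer and remove connections to produce a fixed sparse topology. The sketch below is a minimal illustration, not the authors' exact procedure: the magnitude-based threshold rule, the layer sizes, and the 90% sparsity level are all illustrative assumptions, written in plain NumPy.

# Hypothetical sketch of magnitude-based pruning for a single dense layer.
# The resulting boolean mask defines a fixed sparse topology; the actual
# pruning schedule and RadiX-Net construction are described in the paper.
import numpy as np

def prune_by_magnitude(weights, sparsity):
    """Return a boolean mask that keeps the largest-magnitude weights.

    weights  : 2-D array of dense layer weights
    sparsity : fraction of connections to remove (0.0 - 1.0)
    """
    k = int(np.round(sparsity * weights.size))               # connections to drop
    if k == 0:
        return np.ones_like(weights, dtype=bool)
    threshold = np.sort(np.abs(weights), axis=None)[k - 1]   # k-th smallest magnitude
    return np.abs(weights) > threshold                       # True = connection kept

rng = np.random.default_rng(0)
W = rng.normal(size=(784, 300))       # e.g. MNIST input -> hidden layer (assumed sizes)
mask = prune_by_magnitude(W, 0.9)     # keep roughly 10% of connections
W_sparse = W * mask                   # sparse topology; mask stays fixed during retraining
print(f"remaining connections: {mask.mean():.1%}")

In this kind of setup the mask is held constant while the surviving weights continue to train, which is what makes the sparse topology, rather than the individual weight values, the object of study.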
Pages: 6