StochasticNet: Forming Deep Neural Networks via Stochastic Connectivity

Citations: 13
Authors
Shafiee, Mohammad Javad [1 ]
Siva, Parthipan [2 ]
Wong, Alexander [1 ]
Affiliations
[1] Univ Waterloo, Dept Syst Design Engn, Waterloo, ON N2L 3G1, Canada
[2] Aimetis Corp, Waterloo, ON N2L 4E9, Canada
Source
IEEE ACCESS | 2016, Vol. 4
Funding
Natural Sciences and Engineering Research Council of Canada (NSERC);
Keywords
Deep convolutional neural network; StochasticNet; random graph
DOI
10.1109/ACCESS.2016.2551458
CLC Number
TP [Automation technology, computer technology];
Discipline Code
0812;
Abstract
Deep neural networks are a branch of machine learning that has seen a meteoric rise in popularity due to its powerful ability to represent and model high-level abstractions in highly complex data. One area of deep neural networks that is ripe for exploration is neural connectivity formation. A pivotal study on the brain tissue of rats found that synaptic formation for specific functional connectivity in neocortical neural microcircuits can be surprisingly well modeled and predicted as a random formation. Motivated by this intriguing finding, we introduce the concept of StochasticNet, in which deep neural networks are formed via stochastic connectivity between neurons. As a result, any type of deep neural network can be formed as a StochasticNet by allowing the neuron connectivity to be stochastic. Stochastic synaptic formation in a deep neural network architecture can allow for efficient utilization of neurons for performing specific tasks. To evaluate the feasibility of such a deep neural network architecture, we train StochasticNets on four different image datasets (CIFAR-10, MNIST, SVHN, and STL-10). Experimental results show that a StochasticNet using less than half the number of neural connections of a conventional deep neural network achieves comparable accuracy and reduces overfitting on the CIFAR-10, MNIST, and SVHN datasets. Interestingly, a StochasticNet with less than half the number of neural connections achieved a higher accuracy on the STL-10 dataset than a conventional deep neural network (a relative improvement in test error rate of ~6% compared to ConvNet). Finally, StochasticNets have faster operational speeds while achieving better or comparable accuracy.
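Illustrative sketch (not the authors' implementation): to make the idea of stochastic connectivity concrete, the following minimal PyTorch snippet builds a convolutional layer whose kernel connections are realized once at random with probability p and then held fixed for training and inference. The class name StochasticConv2d, the connection probability p = 0.5, and the layer sizes are assumptions chosen only for illustration of the concept described in the abstract.

import torch
import torch.nn as nn

class StochasticConv2d(nn.Module):
    """Convolutional layer whose connections are formed stochastically once
    at construction time (illustrative sketch, not the paper's code)."""

    def __init__(self, in_channels, out_channels, kernel_size, p=0.5):
        super().__init__()
        self.conv = nn.Conv2d(in_channels, out_channels, kernel_size,
                              padding=kernel_size // 2)
        # Each candidate kernel weight (a "synaptic connection") is realized
        # with probability p; the resulting binary mask is fixed thereafter.
        mask = (torch.rand_like(self.conv.weight) < p).float()
        self.register_buffer("mask", mask)  # not trained, moves with .to(device)

    def forward(self, x):
        # Connections that were never formed contribute nothing to the output.
        return nn.functional.conv2d(x, self.conv.weight * self.mask,
                                    bias=self.conv.bias,
                                    padding=self.conv.padding)

# Usage: roughly half the connections of a dense 3x3 convolution.
layer = StochasticConv2d(3, 16, kernel_size=3, p=0.5)
out = layer(torch.randn(1, 3, 32, 32))   # -> shape (1, 16, 32, 32)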
Pages: 1915 - 1924
Page count: 10