StochasticNet: Forming Deep Neural Networks via Stochastic Connectivity

Cited by: 13
Authors
Shafiee, Mohammad Javad [1 ]
Siva, Parthipan [2 ]
Wong, Alexander [1 ]
Affiliations
[1] Univ Waterloo, Dept Syst Design Engn, Waterloo, ON N2L 3G1, Canada
[2] Aimetis Corp, Waterloo, ON N2L 4E9, Canada
Source
IEEE ACCESS | 2016, Vol. 4
Funding
Natural Sciences and Engineering Research Council of Canada
Keywords
Deep convolutional neural network; StochasticNet; random graph
DOI
10.1109/ACCESS.2016.2551458
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Deep neural networks are a branch of machine learning that has seen a meteoric rise in popularity due to their powerful ability to represent and model high-level abstractions in highly complex data. One area of deep neural networks that is ripe for exploration is neural connectivity formation. A pivotal study on the brain tissue of rats found that synaptic formation for specific functional connectivity in neocortical neural microcircuits can be surprisingly well modeled and predicted as a random formation. Motivated by this intriguing finding, we introduce the concept of StochasticNet, in which deep neural networks are formed via stochastic connectivity between neurons. As a result, any type of deep neural network can be formed as a StochasticNet by allowing the neuron connectivity to be stochastic. Stochastic synaptic formations in a deep neural network architecture can allow for efficient utilization of neurons for performing specific tasks. To evaluate the feasibility of such a deep neural network architecture, we train StochasticNets on four different image datasets (CIFAR-10, MNIST, SVHN, and STL-10). Experimental results show that a StochasticNet using less than half the number of neural connections of a conventional deep neural network achieves comparable accuracy and reduces overfitting on the CIFAR-10, MNIST, and SVHN datasets. Interestingly, a StochasticNet with less than half the number of neural connections achieved a higher accuracy on the STL-10 dataset than a conventional deep neural network (a relative improvement in test error rate of ~6% compared to a ConvNet). Finally, StochasticNets have faster operational speeds while achieving better or similar accuracy.
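The connectivity the abstract describes is drawn randomly once and then held fixed, rather than resampled at every iteration as in dropout. The PyTorch snippet below is a minimal sketch of that idea for a single convolutional layer; it is not the authors' implementation (the paper forms receptive fields via random graph realizations), and the class name StochasticConv2d, the Bernoulli weight mask, and the keep probability p are illustrative assumptions.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class StochasticConv2d(nn.Module):
        # Convolution whose synaptic connectivity is drawn at random once,
        # at construction time, and then held fixed for both training and
        # inference. p = 0.5 mirrors the "less than half the connections"
        # setting described in the abstract (illustrative value).
        def __init__(self, in_channels, out_channels, kernel_size, p=0.5):
            super().__init__()
            self.conv = nn.Conv2d(in_channels, out_channels, kernel_size,
                                  padding=kernel_size // 2)
            # Bernoulli(p) mask over individual weights; a buffer moves with
            # the module (device, state_dict) but is never trained.
            self.register_buffer(
                "mask", (torch.rand_like(self.conv.weight) < p).float())

        def forward(self, x):
            # Only the randomly formed connections carry signal; masked
            # weights receive zero gradient, so the connectivity is fixed.
            return F.conv2d(x, self.conv.weight * self.mask, self.conv.bias,
                            padding=self.conv.padding)

    layer = StochasticConv2d(3, 16, kernel_size=3, p=0.5)
    out = layer(torch.randn(1, 3, 32, 32))  # shape: (1, 16, 32, 32)

Because the mask is sampled once rather than per forward pass, the layer behaves like a fixed sparse realization of a conventional convolution, which is what distinguishes stochastic connectivity formation from dropout-style stochastic regularization.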
Pages: 1915 - 1924
Number of pages: 10
Related Papers
50 records in total
  • [1] A New Stochastic Multiplier for Deep Neural Networks
    Huh, Subin
    Yu, Joonsang
    Choi, Kiyoung
    PROCEEDINGS INTERNATIONAL SOC DESIGN CONFERENCE 2017 (ISOCC 2017), 2017 : 46 - 47
  • [2] On Local Entropy, Stochastic Control, and Deep Neural Networks
    Pavon, Michele
    IEEE CONTROL SYSTEMS LETTERS, 2023, 7 : 437 - 441
  • [3] Stochastic Battery Operations using Deep Neural Networks
    Chen, Yize
    Hashmi, Md Umar
    Deka, Deepjyoti
    Chertkov, Michael
    2019 IEEE POWER & ENERGY SOCIETY INNOVATIVE SMART GRID TECHNOLOGIES CONFERENCE (ISGT), 2019
  • [4] Bridgeout: Stochastic Bridge Regularization for Deep Neural Networks
    Khan, Najeeb
    Shah, Jawad
    Stavness, Ian
    IEEE ACCESS, 2018, 6 : 42961 - 42970
  • [5] Distributed regularized stochastic configuration networks via the elastic net
    Zhao, Lijie
    Zou, Shida
    Huang, Mingzhong
    Wang, Guogang
    NEURAL COMPUTING & APPLICATIONS, 2021, 33 (08) : 3281 - 3297
  • [6] Stochastic Deep Networks
    de Bie, Gwendoline
    Peyre, Gabriel
    Cuturi, Marco
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, 2019, 97
  • [7] Stochastic Neural Networks
    Wong, E.
    ALGORITHMICA, 1991, 6 (03) : 466 - 478
  • [8] Regularizing Neural Networks via Stochastic Branch Layers
    Park, Wonpyo
    Seo, Paul Hongsuck
    Han, Bohyung
    Cho, Minsu
    ASIAN CONFERENCE ON MACHINE LEARNING, 2019, 101 : 678 - 693
  • [9] Preconditioned Stochastic Gradient Langevin Dynamics for Deep Neural Networks
    Li, Chunyuan
    Chen, Changyou
    Carlson, David
    Carin, Lawrence
    THIRTIETH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2016, : 1788 - 1794