IEEE Access (Jan 2016)

StochasticNet: Forming Deep Neural Networks via Stochastic Connectivity

  • Mohammad Javad Shafiee,
  • Parthipan Siva,
  • Alexander Wong

DOI
https://doi.org/10.1109/ACCESS.2016.2551458
Journal volume & issue
Vol. 4
pp. 1915–1924

Abstract

Deep neural networks are a branch of machine learning that has seen a meteoric rise in popularity due to their powerful ability to represent and model high-level abstractions in highly complex data. One area of deep neural networks that is ripe for exploration is neural connectivity formation. A pivotal study on the brain tissue of rats found that synaptic formation for specific functional connectivity in neocortical neural microcircuits can be surprisingly well modeled and predicted as a random formation. Motivated by this intriguing finding, we introduce the concept of StochasticNet, where deep neural networks are formed via stochastic connectivity between neurons. As a result, any type of deep neural network can be formed as a StochasticNet by allowing the neuron connectivity to be stochastic. Stochastic synaptic formation in a deep neural network architecture can allow for efficient utilization of neurons for performing specific tasks. To evaluate the feasibility of such a deep neural network architecture, we train a StochasticNet on four different image datasets (CIFAR-10, MNIST, SVHN, and STL-10). Experimental results show that a StochasticNet using less than half the number of neural connections of a conventional deep neural network achieves comparable accuracy and reduces overfitting on the CIFAR-10, MNIST, and SVHN datasets. Interestingly, a StochasticNet with less than half the number of neural connections achieved higher accuracy on the STL-10 dataset than a conventional deep neural network (a relative improvement in test error rate of ~6% compared to a ConvNet). Finally, StochasticNets have faster operational speeds while achieving better or comparable accuracy.
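
The core mechanism can be sketched as follows: each candidate synapse is realized as an independent Bernoulli trial when the network is formed, yielding a connectivity mask that stays fixed through both training and inference (unlike dropout, which resamples connections on every pass). The snippet below is a minimal NumPy illustration under our own assumptions, not the authors' implementation; the layer sizes, keep probability, and helper names (form_stochastic_layer, forward) are hypothetical.

    import numpy as np

    def form_stochastic_layer(n_in, n_out, keep_prob=0.5, seed=0):
        # Form the layer once: each candidate synapse is realized with
        # probability keep_prob, and unrealized connections are permanently
        # absent (the mask is fixed, unlike dropout's per-pass resampling).
        rng = np.random.default_rng(seed)
        weights = rng.standard_normal((n_in, n_out)) * np.sqrt(2.0 / n_in)
        mask = rng.random((n_in, n_out)) < keep_prob
        return weights * mask, mask

    def forward(x, w):
        # Forward pass through the sparsified layer (ReLU activation).
        return np.maximum(0.0, x @ w)

    # Example: a layer formed with roughly half the connections of its
    # dense counterpart, matching the paper's ~50% connectivity regime.
    w, mask = form_stochastic_layer(n_in=256, n_out=128, keep_prob=0.5)
    x = np.random.default_rng(1).standard_normal((4, 256))
    y = forward(x, w)
    print("realized connections:", int(mask.sum()), "of", mask.size)

During training, the same mask would be applied to the gradients so that unrealized connections never receive updates; in the convolutional setting the paper evaluates, the analogous step would be realizing connections within each receptive field stochastically.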

Keywords