Neural Plasticity (Jan 2020)

The Relationship between Sparseness and Energy Consumption of Neural Networks

  • Guanzheng Wang
  • Rubin Wang
  • Wanzeng Kong
  • Jianhai Zhang

DOI: https://doi.org/10.1155/2020/8848901
Journal volume & issue: Vol. 2020

Abstract

About 50-80% of the total energy consumed by neural networks goes to signaling. A network with many active neurons consumes a great deal of energy, while a network with few active neurons consumes very little. The sparseness, that is, the ratio of active neurons to all neurons in a network, therefore governs its energy consumption. Laughlin’s studies show that the sparseness of an energy-efficient code depends on the balance between signaling and fixed costs, but they give neither an exact ratio of signaling to fixed costs nor the ratio of active neurons to all neurons in the most energy-efficient neural networks. In this paper, we calculated the ratio of signaling costs to fixed costs from physiological data and found it to lie between 1.3 and 2.1. We then calculated the ratio of active neurons to all neurons in the most energy-efficient neural networks and found it to lie between 0.3 and 0.4. These results are consistent with data from many relevant physiological experiments, indicating that the model used in this paper may capture neural coding under realistic conditions. Our calculations may therefore be helpful to the study of neural coding.
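
To make the abstract's argument concrete, the following is a minimal numerical sketch of a Laughlin-style energy-efficiency calculation. It is an illustration under assumed forms, not the paper's actual model: each neuron is active with probability p, information is measured by the binary entropy H(p), and energy cost is taken as E(p) = 1 + r·p, with the fixed cost normalized to 1 and r the signaling-to-fixed cost ratio. The sparseness that maximizes information per unit energy is then found by grid search.

    import numpy as np

    def binary_entropy(p):
        """Shannon entropy in bits of a neuron active with probability p."""
        return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

    def optimal_sparseness(r):
        """Fraction of active neurons that maximizes bits per unit energy.

        Assumed energy model: E(p) = 1 + r * p, where the fixed cost is
        normalized to 1 and r is the signaling-to-fixed cost ratio. This
        is an illustrative stand-in, not the model used in the paper.
        """
        p = np.linspace(0.001, 0.999, 10_000)
        efficiency = binary_entropy(p) / (1.0 + r * p)
        return p[np.argmax(efficiency)]

    # Endpoints of the signaling-to-fixed cost ratio estimated in the paper.
    for r in (1.3, 2.1):
        print(f"r = {r}: most efficient fraction of active neurons = "
              f"{optimal_sparseness(r):.2f}")

Under these assumptions the optimum comes out near 0.36 for r = 1.3 and near 0.31 for r = 2.1, which lies within the 0.3-0.4 range reported in the abstract.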